Estimating storage and query costs

On-demand queries are charged based on the number of bytes read. For current on-demand query pricing, see the Pricing page.

To estimate costs before running a query, use one of the following:

  • Query validator in the Cloud Console or the classic BigQuery web UI
  • --dry_run flag in the bq command-line tool
  • dryRun parameter when submitting a query job using the API
  • Google Cloud Pricing Calculator
  • Client libraries

Estimating query costs

To estimate query costs:

Console

When you enter a query in the Cloud Console, the query validator verifies the query syntax and provides an estimate of the number of bytes read. You can use this estimate to calculate query cost in the Pricing Calculator.


Classic UI

When you enter a query in the classic BigQuery web UI, the query validator verifies the query syntax and provides an estimate of the number of bytes read. You can use this estimate to calculate query cost in the Pricing Calculator.


bq

When you run a query in the bq command-line tool, you can use the --dry_run flag to estimate the number of bytes read. You can use this estimate to calculate query cost in the Pricing Calculator.

A bq tool query that uses the --dry_run flag looks like the following:

bq query \
--use_legacy_sql=false \
--dry_run \
'SELECT
  column1,
  column2,
  column3
FROM
  `project_id.dataset.table`
LIMIT
  1000'

When you run the command, the response contains the estimated bytes read:

Query successfully validated. Assuming the tables are not modified, running this query will process 10918 bytes of data.
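If you script cost checks around the bq tool, you can pull the byte count out of that message. A minimal sketch in Python; the only assumption is the message format shown above:

```python
import re

def parse_dry_run_bytes(message: str) -> int:
    """Extract the estimated bytes read from a bq --dry_run response message."""
    match = re.search(r"will process (\d+) bytes", message)
    if not match:
        raise ValueError("no byte estimate found in message")
    return int(match.group(1))

msg = (
    "Query successfully validated. Assuming the tables are not modified, "
    "running this query will process 10918 bytes of data."
)
print(parse_dry_run_bytes(msg))  # 10918
```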

API

To perform a dry run by using the API, submit a query job with dryRun set to true.
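For example, a jobs.insert request body for a dry run looks like the following. Note that dryRun is set on the job configuration, not inside the query configuration; the query text here is illustrative:

```json
{
  "configuration": {
    "dryRun": true,
    "query": {
      "query": "SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` LIMIT 100",
      "useLegacySql": false
    }
  }
}
```

The response includes the estimate in the statistics.totalBytesProcessed field.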

go

Before trying this sample, follow the Go setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Go API reference documentation.

import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/bigquery"
)

// queryDryRun demonstrates issuing a dry run query to validate query structure and
// provide an estimate of the bytes scanned.
func queryDryRun(w io.Writer, projectID string) error {
	// projectID := "my-project-id"
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	q := client.Query(`
	SELECT
		name,
		COUNT(*) as name_count
	FROM ` + "`bigquery-public-data.usa_names.usa_1910_2013`" + `
	WHERE state = 'WA'
	GROUP BY name`)
	q.DryRun = true
	// Location must match that of the dataset(s) referenced in the query.
	q.Location = "US"

	job, err := q.Run(ctx)
	if err != nil {
		return err
	}
	// Dry run is not asynchronous, so get the latest status and statistics.
	status := job.LastStatus()
	if err := status.Err(); err != nil {
		return err
	}
	fmt.Fprintf(w, "This query will process %d bytes\n", status.Statistics.TotalBytesProcessed)
	return nil
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Java API reference documentation.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.JobStatistics;
import com.google.cloud.bigquery.QueryJobConfiguration;

// Sample to run a dry run query on a table
public class QueryDryRun {

  public static void runQueryDryRun() {
    String query =
        "SELECT name, COUNT(*) as name_count "
            + "FROM `bigquery-public-data.usa_names.usa_1910_2013` "
            + "WHERE state = 'WA' "
            + "GROUP BY name";
    queryDryRun(query);
  }

  public static void queryDryRun(String query) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      QueryJobConfiguration queryConfig =
          QueryJobConfiguration.newBuilder(query).setDryRun(true).setUseQueryCache(false).build();

      Job job = bigquery.create(JobInfo.of(queryConfig));
      JobStatistics.QueryStatistics statistics = job.getStatistics();

      System.out.println(
          "Query dry run performed successfully. Total bytes processed: "
              + statistics.getTotalBytesProcessed());
    } catch (BigQueryException e) {
      System.out.println("Query not performed \n" + e.toString());
    }
  }
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Node.js API reference documentation.

// Import the Google Cloud client library
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryDryRun() {
  // Runs a dry run of a query against the U.S. given names dataset for the state of Texas.

  const query = `SELECT name
    FROM \`bigquery-public-data.usa_names.usa_1910_2013\`
    WHERE state = 'TX'
    LIMIT 100`;

  // For all options, see https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query
  const options = {
    query: query,
    // Location must match that of the dataset(s) referenced in the query.
    location: 'US',
    dryRun: true,
  };

  // Run the query as a job
  const [job] = await bigquery.createQueryJob(options);

  // Print the status and statistics
  console.log('Status:');
  console.log(job.metadata.status);
  console.log('\nJob Statistics:');
  console.log(job.metadata.statistics);
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery Quickstart Using Client Libraries. For more information, see the BigQuery Python API reference documentation.

To perform a dry run using the Python client library, set the QueryJobConfig.dry_run property to True. Client.query() always returns a completed QueryJob when provided a dry run query configuration.

from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

# Start the query, passing in the extra configuration.
query_job = client.query(
    (
        "SELECT name, COUNT(*) as name_count "
        "FROM `bigquery-public-data.usa_names.usa_1910_2013` "
        "WHERE state = 'WA' "
        "GROUP BY name"
    ),
    job_config=job_config,
)  # Make an API request.

# A dry run query completes immediately.
print("This query will process {} bytes.".format(query_job.total_bytes_processed))

Estimating query costs using the Google Cloud Pricing Calculator

To estimate on-demand query costs in the Google Cloud Pricing Calculator, enter the number of bytes that are processed by the query as B, KB, MB, GB, TB, or PB. If your query processes less than 1 TB, the estimate is $0 because BigQuery provides 1 TB of on-demand query processing free per month.
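The calculator's arithmetic is easy to reproduce yourself. A sketch, assuming an illustrative on-demand rate of $5 per TB and the 1 TB monthly free tier; check the Pricing page for current rates:

```python
FREE_TIER_TB = 1.0
PRICE_PER_TB = 5.0  # illustrative rate; see the Pricing page for current pricing

def estimated_query_cost(bytes_processed: int) -> float:
    """Estimate on-demand query cost in USD after applying the monthly free tier."""
    tb = bytes_processed / 2**40  # 1 TB = 2^40 bytes
    billable_tb = max(0.0, tb - FREE_TIER_TB)
    return billable_tb * PRICE_PER_TB

print(estimated_query_cost(10918))      # 0.0 (within the free tier)
print(estimated_query_cost(3 * 2**40))  # 10.0 (2 TB billable at $5/TB)
```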


To estimate the cost of a query using the pricing calculator:

  1. Open the Google Cloud Pricing Calculator.
  2. Click BigQuery.
  3. Click the On-Demand tab.
  4. For Table Name, type the name of the table. For example, airports.
  5. For Storage Pricing, enter 0 in the Storage field.
  6. For Query Pricing, enter the estimated bytes read from your dry run or the query validator.
  7. Click Add To Estimate.
  8. The estimate appears to the right. Notice that you can save or email the estimate.

In this case, the number of bytes read by the query is below the 1 TB of on-demand processing provided via the free tier. As a result, the estimated cost is $0.

Including flat-rate pricing in the Pricing Calculator

If you have flat-rate pricing applied to your billing account, you can click the Flat-Rate tab, choose your flat-rate plan, and add your storage costs to the estimate.


For more information, see Flat-rate pricing.

Estimating storage costs using the Google Cloud Pricing Calculator

To estimate storage costs in the Google Cloud Pricing Calculator, enter the number of bytes that are stored as B, KB, MB, GB, TB, or PB. BigQuery provides 10 GB of storage free per month.
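As with query costs, the storage estimate is simple arithmetic. A sketch, assuming an illustrative active-storage rate of $0.02 per GB per month and the 10 GB monthly free tier; check the Pricing page for current rates:

```python
FREE_TIER_GB = 10.0
PRICE_PER_GB_MONTH = 0.02  # illustrative rate; see the Pricing page for current pricing

def estimated_storage_cost(gb_stored: float) -> float:
    """Estimate monthly storage cost in USD after applying the free tier."""
    billable_gb = max(0.0, gb_stored - FREE_TIER_GB)
    return billable_gb * PRICE_PER_GB_MONTH

print(estimated_storage_cost(100))  # ~1.8 (90 GB billable at $0.02/GB)
print(estimated_storage_cost(5))    # 0.0 (within the free tier)
```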

To estimate storage costs using the pricing calculator:

  1. Open the Google Cloud Pricing Calculator.
  2. Click BigQuery.
  3. Click the On-Demand tab.
  4. For Table Name, type the name of the table. For example, airports.
  5. For Storage Pricing, enter 100 in the Storage field. Leave the measure set to GB.
  6. Click Add To Estimate.
  7. The estimate appears to the right. Notice that you can save or email the estimate.