
Quickstart: Using client libraries

This page shows you how to get started with the BigQuery API in your favorite programming language.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Enable the BigQuery API.

    Enable the API

  4. Create a service account:

    1. In the Google Cloud console, go to the Create service account page.

      Go to Create service account
    2. Select your project.
    3. In the Service account name field, enter a name. The Google Cloud console fills in the Service account ID field based on this name.

      In the Service account description field, enter a description. For example, Service account for quickstart.

    4. Click Create and continue.
    5. To provide access to your project, grant the Project > Owner role to your service account.

      To grant the role, find the Select a role list, then select Project > Owner.

    6. Click Continue.
    7. Click Done to finish creating the service account.

      Do not close your browser window. You will use it in the next step.

  5. Create a service account key:

    1. In the Google Cloud console, click the email address for the service account that you created.
    2. Click Keys.
    3. Click Add key, and then click Create new key.
    4. Click Create. A JSON key file is downloaded to your computer.
    5. Click Close.
  6. Set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the JSON file that contains your service account key. This variable only applies to your current shell session, so if you open a new session, set the variable again.
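
    For example, in a Linux or macOS shell, replacing KEY_PATH with the path of the JSON key file you downloaded:

    export GOOGLE_APPLICATION_CREDENTIALS="KEY_PATH"

    In PowerShell on Windows:

    $env:GOOGLE_APPLICATION_CREDENTIALS="KEY_PATH"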

Install the client library

C#

For more on setting up your C# development environment, refer to the C# Development Environment Setup Guide.

Install-Package Google.Cloud.BigQuery.V2

Go

go mod init YOUR_MODULE_NAME
go get cloud.google.com/go/bigquery

Java

For more on setting up your Java development environment, refer to the Java Development Environment Setup Guide.

If you are using Maven, add the following to your pom.xml file. For more information about BOMs, see The Google Cloud Platform Libraries BOM.

<!--  Using libraries-bom to manage versions.
See https://github.com/GoogleCloudPlatform/cloud-opensource-java/wiki/The-Google-Cloud-Platform-Libraries-BOM -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>26.11.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-bigquery</artifactId>
  </dependency>
</dependencies>

If you are using Gradle, add the following to your dependencies:

implementation platform('com.google.cloud:libraries-bom:26.11.0')

implementation 'com.google.cloud:google-cloud-bigquery'

If you are using sbt, add the following to your dependencies:

libraryDependencies += "com.google.cloud" % "google-cloud-bigquery" % "2.24.3"

If you're using Visual Studio Code, IntelliJ, or Eclipse, you can also add client libraries to your project through your IDE's Google Cloud plugin.

The plugins provide additional functionality, such as key management for service accounts. Refer to each plugin's documentation for details.

Node.js

For more on setting up your Node.js development environment, refer to the Node.js Development Environment Setup Guide.

npm install --save @google-cloud/bigquery

PHP

composer require google/cloud-bigquery

Python

For more on setting up your Python development environment, refer to the Python Development Environment Setup Guide.

pip install --upgrade google-cloud-bigquery

Ruby

For more on setting up your Ruby development environment, refer to the Ruby Development Environment Setup Guide.

gem install google-cloud-bigquery

Import the libraries

C#

For more information, see the BigQuery C# API reference documentation.


using System;
using Google.Cloud.BigQuery.V2;

Go

For more information, see the BigQuery Go API reference documentation.

import (
	"context"
	"fmt"
	"io"
	"log"
	"os"

	"cloud.google.com/go/bigquery"
	"google.golang.org/api/iterator"
)

Java

For more information, see the BigQuery Java API reference documentation.


import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValueList;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;
import java.util.UUID;

Node.js

For more information, see the BigQuery Node.js API reference documentation.

// Import the Google Cloud client library
const {BigQuery} = require('@google-cloud/bigquery');

PHP

For more information, see the BigQuery PHP API reference documentation.

use Google\Cloud\BigQuery\BigQueryClient;

Python

For more information, see the BigQuery Python API reference documentation.

from google.cloud import bigquery

Ruby

For more information, see the BigQuery Ruby API reference documentation.

require "google/cloud/bigquery"

Initialize a BigQuery client

Initialize a client to authenticate and connect to the BigQuery API.

C#

Use the BigQueryClient.Create() function to create the BigQuery client.

string projectId = "YOUR-PROJECT-ID";
var client = BigQueryClient.Create(projectId);

Go

Use the bigquery.NewClient() function to create the BigQuery client.

// projectID is assumed to hold your Google Cloud project ID, for example:
// projectID := "my-project-id"
ctx := context.Background()

client, err := bigquery.NewClient(ctx, projectID)
if err != nil {
	log.Fatalf("bigquery.NewClient: %v", err)
}
defer client.Close()

Java

Use the BigQueryOptions.getDefaultInstance() function to get the default authentication options, and the BigQueryOptions.getService() function to create the BigQuery client.

BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

Node.js

Instantiate the BigQuery class to create the BigQuery client.

// Create a client
const bigqueryClient = new BigQuery();

PHP

Instantiate the BigQueryClient class to create the BigQuery client.

$bigQuery = new BigQueryClient();

Python

Instantiate the bigquery.Client class to create the BigQuery client.

# Construct a BigQuery client object. Credentials are discovered automatically,
# for example from GOOGLE_APPLICATION_CREDENTIALS.
client = bigquery.Client()

Ruby

Use the Google::Cloud::Bigquery.new function to create the BigQuery client.

# This uses Application Default Credentials to authenticate.
# @see https://cloud.google.com/bigquery/docs/authentication/getting-started
bigquery = Google::Cloud::Bigquery.new

Query a dataset

The following query retrieves the most-viewed questions tagged with google-bigquery from the Stack Overflow public dataset.

SELECT
  CONCAT(
    'https://stackoverflow.com/questions/',
    CAST(id as STRING)) as url,
  view_count
FROM `bigquery-public-data.stackoverflow.posts_questions`
WHERE tags like '%google-bigquery%'
ORDER BY view_count DESC
LIMIT 10

This query uses GoogleSQL syntax. The client libraries default to GoogleSQL syntax. To change the SQL dialect, see BigQuery SQL dialects.
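
For example, with the Go client library you can opt a single query into legacy SQL instead of GoogleSQL. A minimal sketch, assuming the client created in the previous section:

// Switch this query's dialect from the default GoogleSQL to legacy SQL.
q := client.Query("SELECT 17")
q.UseLegacySQL = true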

Run the query

Run the following query using the authenticated BigQuery client.

C#

Define a query string and use the client.ExecuteQuery() function to submit the query and get the results.

string query = @"SELECT
    CONCAT(
        'https://stackoverflow.com/questions/',
        CAST(id as STRING)) as url, view_count
    FROM `bigquery-public-data.stackoverflow.posts_questions`
    WHERE tags like '%google-bigquery%'
    ORDER BY view_count DESC
    LIMIT 10";
var result = client.ExecuteQuery(query, parameters: null);

Go

Use the Client.Query() function to define a query and the Query.Read() function to submit the query and get the results.

query := client.Query(
	`SELECT
		CONCAT(
			'https://stackoverflow.com/questions/',
			CAST(id as STRING)) as url,
		view_count
	FROM ` + "`bigquery-public-data.stackoverflow.posts_questions`" + `
	WHERE tags like '%google-bigquery%'
	ORDER BY view_count DESC
	LIMIT 10;`)
return query.Read(ctx)

Java

Define the query with a QueryJobConfiguration instance. Start the query job with the BigQuery.create() method.

QueryJobConfiguration queryConfig =
    QueryJobConfiguration.newBuilder(
            "SELECT CONCAT('https://stackoverflow.com/questions/', "
                + "CAST(id as STRING)) as url, view_count "
                + "FROM `bigquery-public-data.stackoverflow.posts_questions` "
                + "WHERE tags like '%google-bigquery%' "
                + "ORDER BY view_count DESC "
                + "LIMIT 10")
        // Use standard SQL syntax for queries.
        // See: https://cloud.google.com/bigquery/sql-reference/
        .setUseLegacySql(false)
        .build();

// Create a job ID so that we can safely retry.
JobId jobId = JobId.of(UUID.randomUUID().toString());
Job queryJob = bigquery.create(JobInfo.newBuilder(queryConfig).setJobId(jobId).build());

// Wait for the query to complete.
queryJob = queryJob.waitFor();

// Check for errors
if (queryJob == null) {
  throw new RuntimeException("Job no longer exists");
} else if (queryJob.getStatus().getError() != null) {
  // You can also look at queryJob.getStatus().getExecutionErrors() for all
  // errors, not just the latest one.
  throw new RuntimeException(queryJob.getStatus().getError().toString());
}

Node.js

Use the BigQuery.query() method to start the query.

// The SQL query to run
const sqlQuery = `SELECT
  CONCAT(
    'https://stackoverflow.com/questions/',
    CAST(id as STRING)) as url,
  view_count
  FROM \`bigquery-public-data.stackoverflow.posts_questions\`
  WHERE tags like '%google-bigquery%'
  ORDER BY view_count DESC
  LIMIT 10`;

const options = {
  query: sqlQuery,
  // Location must match that of the dataset(s) referenced in the query.
  location: 'US',
};

// Run the query
const [rows] = await bigqueryClient.query(options);

PHP

Create a query configuration with the BigQueryClient.query() method, and use the BigQueryClient.runQuery() method to run the query and wait for the results.

$query = <<<ENDSQL
SELECT
  CONCAT(
    'https://stackoverflow.com/questions/',
    CAST(id as STRING)) as url,
  view_count
FROM `bigquery-public-data.stackoverflow.posts_questions`
WHERE tags like '%google-bigquery%'
ORDER BY view_count DESC
LIMIT 10;
ENDSQL;
$queryJobConfig = $bigQuery->query($query);
$queryResults = $bigQuery->runQuery($queryJobConfig);

Python

Use the Client.query() method to start the query.

query_job = client.query(
    """
    SELECT
      CONCAT(
        'https://stackoverflow.com/questions/',
        CAST(id as STRING)) as url,
      view_count
    FROM `bigquery-public-data.stackoverflow.posts_questions`
    WHERE tags like '%google-bigquery%'
    ORDER BY view_count DESC
    LIMIT 10"""
)

results = query_job.result()  # Waits for job to complete.

Ruby

Use the Google::Cloud::Bigquery::Project.query function to start a query and wait for the results.

sql     = "SELECT " \
          "CONCAT('https://stackoverflow.com/questions/', CAST(id as STRING)) as url, view_count " \
          "FROM `bigquery-public-data.stackoverflow.posts_questions` " \
          "WHERE tags like '%google-bigquery%' " \
          "ORDER BY view_count DESC LIMIT 10"
results = bigquery.query sql

Display the query result

Display the query results.

C#

Console.Write("\nQuery Results:\n------------\n");
foreach (var row in result)
{
    Console.WriteLine($"{row["url"]}: {row["view_count"]} views");
}

Go

Use the RowIterator.Next() function to load each row into a struct pointer.
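
For example, a minimal sketch of this pattern, assuming the imports shown earlier and a hypothetical StackOverflowRow struct whose tags map the url and view_count columns:

// StackOverflowRow maps one result row onto Go fields by column name.
type StackOverflowRow struct {
	URL       string `bigquery:"url"`
	ViewCount int64  `bigquery:"view_count"`
}

// printResults reads each row from the iterator and writes its URL and view count.
func printResults(w io.Writer, iter *bigquery.RowIterator) error {
	for {
		var row StackOverflowRow
		err := iter.Next(&row)
		if err == iterator.Done {
			return nil
		}
		if err != nil {
			return fmt.Errorf("error iterating through results: %w", err)
		}
		fmt.Fprintf(w, "url: %s views: %d\n", row.URL, row.ViewCount)
	}
}

You would pass printResults an io.Writer such as os.Stdout and the iterator returned by query.Read(ctx).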