All BigQuery code samples

This page contains code samples for BigQuery. To search and filter code samples for other Google Cloud products, see the Google Cloud sample browser.

Add a column using a load job

Add a new column to a BigQuery table while appending rows using a load job.

View in documentation

Add a column using a query job

Add a new column to a BigQuery table while appending rows using a query job with an explicit destination table.

View in documentation

Add a label

Add a label to a table.

Add an empty column

Manually add an empty column.

View in documentation

Array parameters

Run a query with array parameters.

View in documentation
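
As a minimal Python sketch of the array-parameter pattern with the google-cloud-bigquery client (the public table and parameter names mirror the documented sample; the returned rows depend on your project's access):

```python
def query_array_params(genders, states):
    """Query a public table, binding Python lists as array parameters."""
    from google.cloud import bigquery

    client = bigquery.Client()
    sql = """
        SELECT name, SUM(number) AS count
        FROM `bigquery-public-data.usa_names.usa_1910_2013`
        WHERE gender IN UNNEST(@genders)
        AND state IN UNNEST(@states)
        GROUP BY name
        ORDER BY count DESC
        LIMIT 10
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            # ArrayQueryParameter binds a Python list to an @name placeholder.
            bigquery.ArrayQueryParameter("genders", "STRING", genders),
            bigquery.ArrayQueryParameter("states", "STRING", states),
        ]
    )
    return list(client.query(sql, job_config=job_config))
```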

Authorize a BigQuery Dataset

Authorize access to a BigQuery dataset.

Cancel a job

Attempt to cancel a job.

View in documentation

Check dataset existence

A function to check whether a dataset exists.

View in documentation
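
A minimal sketch of the existence check with the Python client, relying on `get_dataset` raising `NotFound` for a missing dataset:

```python
def dataset_exists(dataset_id):
    """Return True if the dataset exists, False otherwise."""
    from google.cloud import bigquery
    from google.cloud.exceptions import NotFound

    client = bigquery.Client()
    try:
        client.get_dataset(dataset_id)  # Raises NotFound if absent.
        return True
    except NotFound:
        return False
```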

Clustered table

Load data from a CSV file on Cloud Storage to a clustered table.

View in documentation

Column-based time partitioning

Create a table that uses column-based time partitioning.

View in documentation
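
A Python sketch of column-based time partitioning (the schema fields and 90-day expiration are illustrative choices, not requirements):

```python
def create_partitioned_table(table_id):
    """Create a table partitioned by a DATE column."""
    from google.cloud import bigquery

    client = bigquery.Client()
    schema = [
        bigquery.SchemaField("name", "STRING"),
        bigquery.SchemaField("date", "DATE"),
    ]
    table = bigquery.Table(table_id, schema=schema)
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        field="date",  # Partition on this column instead of ingestion time.
        expiration_ms=1000 * 60 * 60 * 24 * 90,  # Drop partitions after 90 days.
    )
    return client.create_table(table)
```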

Copy a single-source table

Copy a single-source table to a given destination.

View in documentation

Copy a table

Copy a table with customer-managed encryption keys.

View in documentation

Copy multiple tables

Copy multiple source tables to a given destination.

View in documentation

Create a client with a service account key file

Create a BigQuery client using a service account key file.

Create a client with application default credentials

Create a BigQuery client using application default credentials.

View in documentation

Create a clustered table

Create a clustered table.

View in documentation

Create a dataset

Create a dataset in BigQuery with a unique name. Note that a dataset's geographic location cannot be changed after the dataset is created.
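
A minimal Python sketch; the dataset ID is a placeholder of the form `"your-project.your_dataset"`:

```python
def create_dataset(dataset_id, location="US"):
    """Create a dataset; its location is fixed once the dataset exists."""
    from google.cloud import bigquery

    client = bigquery.Client()
    dataset = bigquery.Dataset(dataset_id)  # e.g. "your-project.your_dataset"
    dataset.location = location
    # Raises google.api_core.exceptions.Conflict if the dataset already exists.
    return client.create_dataset(dataset, timeout=30)
```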

Create a dataset with a customer-managed encryption key

The following example creates a dataset named `mydataset`, and also uses the `google_kms_crypto_key` and `google_kms_key_ring` resources to specify a Cloud Key Management Service key for the dataset. You must enable the Cloud Key Management Service API before running this example.

View in documentation

Create a job

Run a BigQuery job (query, load, extract, or copy) in a specified location with additional configuration.

View in documentation

Create a model

Create a model within an existing dataset.

Create a routine

Create a routine within an existing dataset.

Create a routine with DDL

Create a routine using a DDL query.

Create a table

Create a table with customer-managed encryption keys.

View in documentation

Create a table using a template

Use the properties of an existing table (schema, partitioning, clustering) to create a new, empty table with the same configuration.

Create a view

Create a view within a dataset.

View in documentation
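
A sketch of creating a logical view with the Python client; the view SQL and column names are illustrative placeholders:

```python
def create_view(view_id, source_table_id):
    """Create a logical view over an existing table."""
    from google.cloud import bigquery

    client = bigquery.Client()
    view = bigquery.Table(view_id)
    # Setting view_query makes create_table create a view, not a table.
    view.view_query = f"SELECT name, state FROM `{source_table_id}`"
    return client.create_table(view)
```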

Create a view with DDL

Create a view using a DDL query.

Create an authorized view

Create an authorized view using GitHub public data.

View in documentation

Create an integer-range partitioned table

Create a new integer-range partitioned table in an existing dataset.

View in documentation

Create credentials with scopes

Create credentials with Drive and BigQuery API scopes.

View in documentation

Create external table with hive partitioning

Create an external table using hive partitioning.

Create IAM policy

Create an IAM policy for a table.

View in documentation

Create materialized view

Create a materialized view.

View in documentation

Create table with schema

Create a table with a schema.

Delete a dataset

Delete a dataset from a project.

View in documentation

Delete a dataset and its contents

Delete a dataset and its contents from a project.
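
A minimal Python sketch covering both deletion cases via the `delete_contents` flag:

```python
def delete_dataset(dataset_id):
    """Delete a dataset even if it still contains tables."""
    from google.cloud import bigquery

    client = bigquery.Client()
    client.delete_dataset(
        dataset_id,
        delete_contents=True,  # Also drop any tables in the dataset.
        not_found_ok=True,     # Do not raise if it is already gone.
    )
```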

Delete a label from a dataset

Remove a label from a dataset.

View in documentation

Delete a label from a table

Remove a label from a table.

View in documentation

Delete a model

Delete a model from a dataset.

View in documentation

Delete a routine

Delete a routine from a dataset.

Delete a table

Delete a table from a dataset.

View in documentation

Delete materialized view

Delete a materialized view.

View in documentation

Disable query cache

Run a query that disables the use of the query cache.

View in documentation
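
A minimal sketch: the only change from a default query is `use_query_cache=False` on the job config:

```python
def query_without_cache(sql):
    """Run a query with the query cache disabled."""
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.QueryJobConfig(use_query_cache=False)
    return list(client.query(sql, job_config=job_config))
```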

Download public table data to DataFrame

Use the BigQuery Storage API to speed up downloads of large tables to DataFrame.

Download public table data to DataFrame from the sandbox

Use the BigQuery Storage API to download query results to DataFrame.

Download query results to a GeoPandas GeoDataFrame

Download query results to a GeoPandas GeoDataFrame.

Download query results to DataFrame

Get query results as a Pandas DataFrame.
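
A one-call sketch of this pattern with the Python client:

```python
def query_to_dataframe(sql):
    """Run a query and return the results as a pandas DataFrame."""
    from google.cloud import bigquery

    client = bigquery.Client()
    # to_dataframe() uses the BigQuery Storage API automatically when the
    # google-cloud-bigquery-storage package is installed.
    return client.query(sql).to_dataframe()
```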

Download table data to DataFrame

Download table data to a Pandas DataFrame.

Enable large results

Run a query that enables large result sets using legacy SQL.

View in documentation

Export a model

Export an existing model to an existing Cloud Storage bucket.

View in documentation

Export a table to a compressed file

Export a table to a compressed file in a Cloud Storage bucket.

View in documentation

Export a table to a CSV file

Export a table to a CSV file in a Cloud Storage bucket.

View in documentation

Export a table to a JSON file

Export a table to a newline-delimited JSON file in a Cloud Storage bucket.
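
A Python sketch of the JSON extract job; the `gs://` destination URI is an illustrative placeholder:

```python
def extract_table_json(table_id, destination_uri):
    """Export a table to newline-delimited JSON in Cloud Storage."""
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.ExtractJobConfig(
        destination_format=bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON
    )
    # destination_uri is e.g. "gs://your-bucket/table.json" (placeholder).
    job = client.extract_table(table_id, destination_uri, job_config=job_config)
    job.result()  # Wait for the extract job to finish.
```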

Get a model

Get a model resource for a given model ID.

View in documentation

Get a routine

Get a routine resource for a given routine ID.

Get dataset labels

Retrieve the labels of a dataset for a given dataset ID.

View in documentation

Get dataset properties

Retrieve the properties of a dataset.

View in documentation

Get job properties

Retrieve the properties of a job for a given job ID.

View in documentation

Get table labels

Retrieve the labels of a table for a given table ID.

View in documentation
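
A minimal Python sketch; labels come back as a plain dict on the table resource:

```python
def get_table_labels(table_id):
    """Return the labels attached to a table as a dict."""
    from google.cloud import bigquery

    client = bigquery.Client()
    table = client.get_table(table_id)  # table_id like "project.dataset.table"
    return table.labels
```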

Get table properties

Retrieve the properties of a table for a given table ID.

View in documentation

Get total rows

Run a query and get total rows.

Get view properties

Retrieve the properties of a view for a given view ID.

View in documentation

Grant view access

Authorize and grant access to a view.

Import a local file

Import a local file into a table.

View in documentation

Insert GeoJSON data

Stream rows into a GEOGRAPHY column using GeoJSON data.

View in documentation

Insert rows with no IDs

Insert rows without row IDs in a table.

View in documentation
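
A minimal sketch of streaming inserts without row IDs; passing `None` IDs disables BigQuery's best-effort de-duplication:

```python
def insert_rows(table_id, rows):
    """Stream JSON rows into a table; rows is a list of dicts."""
    from google.cloud import bigquery

    client = bigquery.Client()
    errors = client.insert_rows_json(
        table_id, rows, row_ids=[None] * len(rows)  # No IDs: no de-duplication.
    )
    if errors:
        raise RuntimeError(f"Errors while inserting rows: {errors}")
```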

Insert WKT data

Stream rows into a GEOGRAPHY column using WKT data.

View in documentation

List by label

List datasets, filtering by labels.

View in documentation

List datasets

List all datasets in a project.

View in documentation

List jobs

List all jobs in a project.

View in documentation

List models

List all models in a dataset.

View in documentation

List models using streaming

List all models in a dataset using streaming.

List routines

List all routines in a dataset.

Load a CSV file

Load a CSV file from Cloud Storage using an explicit schema.

View in documentation
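
A Python sketch of the CSV load job; the two-column schema matches the documented `us-states` sample and is otherwise a placeholder:

```python
def load_csv(table_id, uri):
    """Load a CSV file from Cloud Storage using an explicit schema."""
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        schema=[
            bigquery.SchemaField("name", "STRING"),
            bigquery.SchemaField("post_abbr", "STRING"),
        ],
        skip_leading_rows=1,  # Skip the CSV header row.
        source_format=bigquery.SourceFormat.CSV,
    )
    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # Wait for the load job to complete.
    return client.get_table(table_id).num_rows
```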

Load a CSV file to replace a table

Load a CSV file from Cloud Storage, replacing a table.

View in documentation

Load a CSV file with autodetect schema

Load a CSV file from Cloud Storage using an autodetected schema.

View in documentation

Load a DataFrame to BigQuery with pandas-gbq

Use the pandas-gbq package to load a DataFrame to BigQuery.
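
A minimal pandas-gbq sketch; the DataFrame contents and `"dataset.table"` destination are illustrative:

```python
def dataframe_to_bigquery(destination_table, project_id):
    """Load a small DataFrame into BigQuery with pandas-gbq."""
    import pandas
    import pandas_gbq

    df = pandas.DataFrame({"my_string": ["a", "b"], "my_int": [1, 2]})
    # destination_table is "dataset.table"; if_exists controls replace/append.
    pandas_gbq.to_gbq(
        df, destination_table, project_id=project_id, if_exists="replace"
    )
```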

Load a JSON file

Load a JSON file from Cloud Storage using an explicit schema.

Load a JSON file to replace a table

Load a JSON file from Cloud Storage, replacing a table.

View in documentation

Load a JSON file with autodetect schema

Load a JSON file from Cloud Storage using autodetect schema.

View in documentation

Load a Parquet file

Load a Parquet file from Cloud Storage into a new table.

Load a Parquet file to replace a table

Load a Parquet file from Cloud Storage, replacing a table.

Load a table in JSON format

Load JSON data from Cloud Storage into a table that uses customer-managed encryption keys.

View in documentation

Load an Avro file

Load an Avro file from Cloud Storage into a new table.

View in documentation

Load an Avro file to replace a table

Load an Avro file from Cloud Storage, replacing existing table data.

View in documentation

Load an ORC file

Load an ORC file from Cloud Storage into a new table.

View in documentation

Load an ORC file to replace a table

Load an ORC file from Cloud Storage, replacing a table.

View in documentation

Load data from DataFrame

Load contents of a pandas DataFrame to a table.

Load data into a column-based time partitioning table

Load data into a table that uses column-based time partitioning.

View in documentation

Migration Guide: pandas-gbq

Samples for the guide on migrating from pandas-gbq to google-cloud-bigquery.

View in documentation

Named parameters

Run a query with named parameters.

View in documentation
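
A Python sketch of @-style named parameters, using the documented `shakespeare` public-table query:

```python
def query_named_params(corpus, min_word_count):
    """Run a query with named query parameters."""
    from google.cloud import bigquery

    client = bigquery.Client()
    sql = """
        SELECT word, word_count
        FROM `bigquery-public-data.samples.shakespeare`
        WHERE corpus = @corpus
        AND word_count >= @min_word_count
        ORDER BY word_count DESC
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            # Each ScalarQueryParameter binds one typed value to an @name.
            bigquery.ScalarQueryParameter("corpus", "STRING", corpus),
            bigquery.ScalarQueryParameter(
                "min_word_count", "INT64", min_word_count
            ),
        ]
    )
    return list(client.query(sql, job_config=job_config))
```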

Named parameters and provided types

Run a query with named parameters and provided parameter types.

Nested repeated schema

Specify nested and repeated columns in schema.

Positional parameters

Run a query with positional parameters.

View in documentation

Positional parameters and provided types

Run a query with positional parameters and provided parameter types.

Preview table data

Retrieve selected row data from a table.

Query a clustered table

Query a table that has a clustering specification.

View in documentation

Query a column-based time-partitioned table

Query a table that uses column-based time partitioning.

Query a table

Query a table with customer-managed encryption keys.

View in documentation

Query Bigtable using a permanent table

Query data from a Bigtable instance by creating a permanent table.

View in documentation

Query Bigtable using a temporary table

Query data from a Bigtable instance by creating a temporary table.

View in documentation

Query Cloud Storage with a permanent table

Query data from a file on Cloud Storage by creating a permanent table.

View in documentation

Query Cloud Storage with a temporary table

Query data from a file on Cloud Storage by creating a temporary table.

Query materialized view

Query an existing materialized view.

Query pagination

Run a query and get rows using automatic pagination.

View in documentation

Query script

Run a query script.

Query Sheets with a permanent table

Query data from a Google Sheets file by creating a permanent table.

View in documentation

Query Sheets with a temporary table

Query data from a Google Sheets file by creating a temporary table.

View in documentation

Relax a column

Change columns from required to nullable.

View in documentation
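
A Python sketch of relaxing column modes in place; this rewrites every column as NULLABLE, which is the one schema mode change the update API allows:

```python
def relax_columns(table_id):
    """Change all REQUIRED columns on a table to NULLABLE."""
    from google.cloud import bigquery

    client = bigquery.Client()
    table = client.get_table(table_id)
    table.schema = [
        bigquery.SchemaField(field.name, field.field_type, mode="NULLABLE")
        for field in table.schema
    ]
    # Only the schema property is sent in the update.
    return client.update_table(table, ["schema"])
```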

Relax a column in a load append job

Change a column from required to nullable in a load append job.

View in documentation

Relax a column in a query append job

Change a column from required to nullable in a query append job.

View in documentation

Revoke access to a dataset

Remove a user or group's permissions to access a BigQuery dataset.

View in documentation

Run a legacy SQL query with pandas-gbq

Use the pandas-gbq package to run a query using legacy SQL syntax.

Run a query with batch priority

Run a query job using batch priority.

View in documentation

Run a query with legacy SQL

Run a query using legacy SQL syntax.