Customer-managed Cloud KMS keys

By default, BigQuery encrypts your content stored at rest. BigQuery handles and manages this default encryption for you without any additional actions on your part. First, data in a BigQuery table is encrypted using a data encryption key. Then, those data encryption keys are encrypted with key encryption keys, which is known as envelope encryption. Key encryption keys don't directly encrypt your data but are used to encrypt the data encryption keys that Google uses to encrypt your data. For more information, see Cloud Key Management Service (KMS).

If you want to control encryption yourself, you can use customer-managed encryption keys (CMEK) for BigQuery. Instead of Google managing the key encryption keys that protect your data, you control and manage key encryption keys in Cloud KMS. This document provides details about this technique.

Learn more about encryption options on Google Cloud. For specific information about CMEK, including its advantages and limitations, see Customer-managed encryption keys.

Before you begin

  • All data assets residing in BigQuery managed storage support CMEK. However, if you are also querying data stored in an external data source such as Cloud Storage that has CMEK-encrypted data, then the data encryption is managed by Cloud Storage. For example, BigLake tables support data encrypted with CMEK in Cloud Storage.

    BigQuery and BigLake tables don't support Customer-Supplied Encryption Keys (CSEK).

  • Decide whether you are going to run BigQuery and Cloud KMS in the same Google Cloud project, or in different projects. For documentation example purposes, the following convention is used:

    • PROJECT_ID: the project ID of the project running BigQuery
    • PROJECT_NUMBER: the project number of the project running BigQuery
    • KMS_PROJECT_ID: the project ID of the project running Cloud KMS (even if this is the same project running BigQuery)
    For information about Google Cloud project IDs and project numbers, see Identifying projects.

  • BigQuery is automatically enabled in new projects. If you are using a pre-existing project to run BigQuery, enable the BigQuery API.

  • For the Google Cloud project that runs Cloud KMS:

    1. Enable the Cloud Key Management Service API.
    2. Create a key ring and a key as described in Creating key rings and keys. Create the key ring in a location that matches the location of your BigQuery dataset:
      • Any multi-regional dataset should use a multi-regional key ring from a matching location. For example, a dataset in region US should be protected with a key ring from region us, and a dataset in region EU should be protected with a key ring from region europe.
      • Regional datasets should use a matching regional key ring. For example, a dataset in region asia-northeast1 should be protected with a key ring from region asia-northeast1.
      • The global region is not supported for use with BigQuery.
      For more information about the supported locations for BigQuery and Cloud KMS, see Cloud locations.

A decryption call is performed using Cloud KMS once per query to a CMEK-encrypted table. For more information, see Cloud KMS pricing.

Encryption specification

Cloud KMS keys used to protect your data in BigQuery are AES-256 keys. These keys are used as key encryption keys in BigQuery, in that they encrypt the data encryption keys that encrypt your data.

Grant encryption and decryption permission

To protect your BigQuery data with a CMEK key, grant the BigQuery service account permission to encrypt and decrypt using that key. The Cloud KMS CryptoKey Encrypter/Decrypter role grants this permission.

Make sure your service account has been created, and then use the Google Cloud console to determine the BigQuery service account ID. Next, provide the service account with the appropriate role to encrypt and decrypt using Cloud KMS.

Trigger creation of your service account

Your BigQuery service account is not created when you first create a project. To trigger the creation of your service account, enter a command that uses it, such as the bq show --encryption_service_account command, or call the projects.getServiceAccount API method. For example:

bq show --encryption_service_account --project_id=PROJECT_ID

Determine the service account ID

The BigQuery service account ID is of the form:

bq-PROJECT_NUMBER@bigquery-encryption.iam.gserviceaccount.com

The following techniques show how you can determine the BigQuery service account ID for your project.

Console

  1. Go to the Dashboard page in the Google Cloud console.

    Go to the Dashboard page

  2. Click the Select from drop-down list at the top of the page. In the Select From window that appears, select your project.

  3. Both the project ID and project number are displayed on the project Dashboard Project info card:

  4. In the following string, replace PROJECT_NUMBER with your project number. The new string identifies your BigQuery service account ID.

    bq-PROJECT_NUMBER@bigquery-encryption.iam.gserviceaccount.com
    

bq

Use the bq show command with the --encryption_service_account flag to determine the service account ID:

bq show --encryption_service_account

The command displays the service account ID:

                  ServiceAccountID
-------------------------------------------------------------
bq-PROJECT_NUMBER@bigquery-encryption.iam.gserviceaccount.com

Assign the Encrypter/Decrypter role

Assign the Cloud KMS CryptoKey Encrypter/Decrypter role to the BigQuery service account that you determined in the previous section. This account is of the form:

bq-PROJECT_NUMBER@bigquery-encryption.iam.gserviceaccount.com

Console

  1. Open the Cryptographic Keys page in the Google Cloud console.

    Open the Cryptographic Keys page

  2. Click the name of the key ring that contains the key.

  3. Click the checkbox for the encryption key to which you want to add the role. The Permissions tab opens.

  4. Click Add member.

  5. Enter the email address of the service account, bq-PROJECT_NUMBER@bigquery-encryption.iam.gserviceaccount.com.

    • If the service account is already on the members list, it has existing roles. Click the current role drop-down list for the bq-PROJECT_NUMBER@bigquery-encryption.iam.gserviceaccount.com service account.
  6. Click the drop-down list for Select a role, click Cloud KMS, and then click the Cloud KMS CryptoKey Encrypter/Decrypter role.

  7. Click Save to apply the role to the bq-PROJECT_NUMBER@bigquery-encryption.iam.gserviceaccount.com service account.

gcloud

You can use the Google Cloud CLI to assign the role:

gcloud kms keys add-iam-policy-binding \
--project=KMS_PROJECT_ID \
--member serviceAccount:bq-PROJECT_NUMBER@bigquery-encryption.iam.gserviceaccount.com \
--role roles/cloudkms.cryptoKeyEncrypterDecrypter \
--location=KMS_KEY_LOCATION \
--keyring=KMS_KEY_RING \
KMS_KEY

Replace the following:

  • KMS_PROJECT_ID: the ID of your Google Cloud project that is running Cloud KMS
  • PROJECT_NUMBER: the project number (not project ID) of your Google Cloud project that is running BigQuery
  • KMS_KEY_LOCATION: the location name of your Cloud KMS key
  • KMS_KEY_RING: the key ring name of your Cloud KMS key
  • KMS_KEY: the key name of your Cloud KMS key

Key resource ID

The resource ID for the Cloud KMS key is required for CMEK use, as shown in the examples. The resource ID is case-sensitive and takes the form:

projects/KMS_PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY

Retrieve the key resource ID

  1. Open the Cryptographic Keys page in the Google Cloud console.

    Open the Cryptographic Keys page

  2. Click the name of the key ring that contains the key.

  3. For the key whose resource ID you are retrieving, click More.

  4. Click Copy Resource Name. The resource ID for the key is copied to your clipboard. The resource ID is also known as the resource name.

Create a table protected by Cloud KMS

To create a table that is protected by Cloud KMS:

Console

  1. Open the BigQuery page in the Google Cloud console.

    Go to the BigQuery page

  2. In the Explorer panel, expand your project and select a dataset.

  3. Expand the Actions option and click Open.

  4. In the details panel, click Create table.

  5. On the Create table page, fill in the information needed to create an empty table with a schema definition. Before you click Create table, set the encryption type and specify the Cloud KMS key to use with the table:

    1. Click Advanced options.
    2. Click Customer-managed key.
    3. Select the key. If the key you want to use is not listed, enter the resource ID for the key.
  6. Click Create table.

SQL

Use the CREATE TABLE statement with the kms_key_name option:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, enter the following statement:

    CREATE TABLE DATASET_ID.TABLE_ID (
      name STRING, value INT64
    ) OPTIONS (
        kms_key_name
          = 'projects/KMS_PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY');
    

  3. Click Run.

For more information about how to run queries, see Run an interactive query.

bq

You can use the bq command-line tool with the --destination_kms_key flag to create the table. The --destination_kms_key flag specifies the resource ID of the key to use with the table.

To create an empty table with a schema:

bq mk --schema name:string,value:integer -t \
--destination_kms_key projects/KMS_PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY \
DATASET_ID.TABLE_ID

To create a table from a query:

bq query --destination_table=DATASET_ID.TABLE_ID \
--destination_kms_key projects/KMS_PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY \
"SELECT name,count FROM DATASET_ID.TABLE_ID WHERE gender = 'M' ORDER BY count DESC LIMIT 6"

For more information about the bq command-line tool, see Using the bq command-line tool.

Terraform

Use the google_bigquery_table resource.

The following example creates a table named mytable, and also uses the google_kms_crypto_key and google_kms_key_ring resources to specify a Cloud Key Management Service key for the table.

To run this example, you must enable the Cloud Resource Manager API and the Cloud Key Management Service API.

resource "google_bigquery_dataset" "default" {
  dataset_id                      = "mydataset"
  default_partition_expiration_ms = 2592000000  # 30 days
  default_table_expiration_ms     = 31536000000 # 365 days
  description                     = "dataset description"
  location                        = "US"
  max_time_travel_hours           = 96 # 4 days

  labels = {
    billing_group = "accounting",
    pii           = "sensitive"
  }
}

resource "google_bigquery_table" "default" {
  dataset_id          = google_bigquery_dataset.default.dataset_id
  table_id            = "mytable"
  deletion_protection = false # set to "true" in production

  schema = <<EOF
[
  {
    "name": "ID",
    "type": "INT64",
    "mode": "NULLABLE",
    "description": "Item ID"
  },
  {
    "name": "Item",
    "type": "STRING",
    "mode": "NULLABLE"
  }
]
EOF

  encryption_configuration {
    kms_key_name = google_kms_crypto_key.crypto_key.id
  }

  depends_on = [google_project_iam_member.service_account_access]
}

resource "google_kms_crypto_key" "crypto_key" {
  name     = "example-key"
  key_ring = google_kms_key_ring.key_ring.id
}

resource "random_id" "default" {
  byte_length = 8
}

resource "google_kms_key_ring" "key_ring" {
  name     = "${random_id.default.hex}-example-keyring"
  location = "us"
}

# Enable the BigQuery service account to encrypt/decrypt Cloud KMS keys
data "google_project" "project" {
}

resource "google_project_iam_member" "service_account_access" {
  project = data.google_project.project.project_id
  role    = "roles/cloudkms.cryptoKeyEncrypterDecrypter"
  member  = "serviceAccount:bq-${data.google_project.project.number}@bigquery-encryption.iam.gserviceaccount.com"
}

To apply your Terraform configuration in a Google Cloud project, complete the steps in the following sections.

Prepare Cloud Shell

  1. Launch Cloud Shell.
  2. Set the default Google Cloud project where you want to apply your Terraform configurations.

    You only need to run this command once per project, and you can run it in any directory.

    export GOOGLE_CLOUD_PROJECT=PROJECT_ID

    Environment variables are overridden if you set explicit values in the Terraform configuration file.

Prepare the directory

Each Terraform configuration file must have its own directory (also called a root module).

  1. In Cloud Shell, create a directory and a new file within that directory. The filename must have the .tf extension—for example main.tf. In this tutorial, the file is referred to as main.tf.
    mkdir DIRECTORY && cd DIRECTORY && touch main.tf
  2. If you are following a tutorial, you can copy the sample code in each section or step.

    Copy the sample code into the newly created main.tf.

    Optionally, copy the code from GitHub. This is recommended when the Terraform snippet is part of an end-to-end solution.

  3. Review and modify the sample parameters to apply to your environment.
  4. Save your changes.
  5. Initialize Terraform. You only need to do this once per directory.
    terraform init

    Optionally, to use the latest Google provider version, include the -upgrade option:

    terraform init -upgrade

Apply the changes

  1. Review the configuration and verify that the resources that Terraform is going to create or update match your expectations:
    terraform plan

    Make corrections to the configuration as necessary.

  2. Apply the Terraform configuration by running the following command and entering yes at the prompt:
    terraform apply

    Wait until Terraform displays the "Apply complete!" message.

  3. Open your Google Cloud project to view the results. In the Google Cloud console, navigate to your resources in the UI to make sure that Terraform has created or updated them.

Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import (
	"context"
	"fmt"

	"cloud.google.com/go/bigquery"
)

// createTableWithCMEK demonstrates creating a table protected with a customer managed encryption key.
func createTableWithCMEK(projectID, datasetID, tableID string) error {
	// projectID := "my-project-id"
	// datasetID := "mydatasetid"
	// tableID := "mytableid"
	ctx := context.Background()

	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	tableRef := client.Dataset(datasetID).Table(tableID)
	meta := &bigquery.TableMetadata{
		EncryptionConfig: &bigquery.EncryptionConfig{
			// TODO: Replace this key with a key you have created in Cloud KMS.
			KMSKeyName: "projects/cloud-samples-tests/locations/us/keyRings/test/cryptoKeys/test",
		},
	}
	if err := tableRef.Create(ctx, meta); err != nil {
		return err
	}
	return nil
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.EncryptionConfiguration;
import com.google.cloud.bigquery.Field;
import com.google.cloud.bigquery.Schema;
import com.google.cloud.bigquery.StandardSQLTypeName;
import com.google.cloud.bigquery.StandardTableDefinition;
import com.google.cloud.bigquery.TableDefinition;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.TableInfo;

// Sample to create a cmek table
public class CreateTableCMEK {

  public static void runCreateTableCMEK() {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    String tableName = "MY_TABLE_NAME";
    String kmsKeyName = "MY_KEY_NAME";
    Schema schema =
        Schema.of(
            Field.of("stringField", StandardSQLTypeName.STRING),
            Field.of("booleanField", StandardSQLTypeName.BOOL));
    // i.e. projects/{project}/locations/{location}/keyRings/{key_ring}/cryptoKeys/{cryptoKey}
    EncryptionConfiguration encryption =
        EncryptionConfiguration.newBuilder().setKmsKeyName(kmsKeyName).build();
    createTableCMEK(datasetName, tableName, schema, encryption);
  }

  public static void createTableCMEK(
      String datasetName, String tableName, Schema schema, EncryptionConfiguration configuration) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      TableId tableId = TableId.of(datasetName, tableName);
      TableDefinition tableDefinition = StandardTableDefinition.of(schema);
      TableInfo tableInfo =
          TableInfo.newBuilder(tableId, tableDefinition)
              .setEncryptionConfiguration(configuration)
              .build();

      bigquery.create(tableInfo);
      System.out.println("Table cmek created successfully");
    } catch (BigQueryException e) {
      System.out.println("Table cmek was not created. \n" + e.toString());
    }
  }
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

Protect a new table with a customer-managed encryption key by setting the Table.encryption_configuration property to an EncryptionConfiguration object before creating the table.

from google.cloud import bigquery

client = bigquery.Client()

# TODO(dev): Change table_id to the full name of the table you want to create.
table_id = "your-project.your_dataset.your_table_name"

# Set the encryption key to use for the table.
# TODO: Replace this key with a key you have created in Cloud KMS.
kms_key_name = "projects/your-project/locations/us/keyRings/test/cryptoKeys/test"

table = bigquery.Table(table_id)
table.encryption_configuration = bigquery.EncryptionConfiguration(
    kms_key_name=kms_key_name
)
table = client.create_table(table)  # API request

print(f"Created {table_id}.")
print(f"Key: {table.encryption_configuration.kms_key_name}.")

Query a table protected by a Cloud KMS key

No special arrangements are required to query a table protected by Cloud KMS. BigQuery stores the name of the key used to encrypt the table content and uses that key when a table protected by Cloud KMS is queried.

All existing tools, the BigQuery console, and the bq command-line tool run the same way as with default-encrypted tables, as long as BigQuery has access to the Cloud KMS key used to encrypt the table content.
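
No code changes are needed either. As an illustration, the following minimal Python sketch (the table name is a placeholder) queries a CMEK-protected table exactly as it would query any other table:

from google.cloud import bigquery

client = bigquery.Client()

# TODO(developer): Replace with a table that is protected by a Cloud KMS key.
table_id = "your-project.your_dataset.your_cmek_table"

# No encryption settings are passed. BigQuery looks up the Cloud KMS key
# recorded for the table and uses it transparently, provided the BigQuery
# service account still has access to that key.
query_job = client.query(f"SELECT * FROM `{table_id}` LIMIT 10")  # API request.
for row in query_job.result():
    print(row)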

Protect query results with a Cloud KMS key

By default, query results are stored in a temporary table encrypted with a Google-managed key. To use a Cloud KMS key to encrypt your query results instead, select one of the following options:

Console

  1. Open the BigQuery page in the Google Cloud console.

    Go to the BigQuery page

  2. Click Compose new query.

  3. Enter a valid GoogleSQL query in the query text area.

  4. Click More, click Query settings, then click Advanced options.

  5. Select Customer-managed encryption.

  6. Select the key. If the key you want to use is not listed, enter the resource ID for the key.

  7. Click Save.

  8. Click Run.

bq

Specify the flag --destination_kms_key to protect the destination table or query results (if using a temporary table) with your Cloud KMS key. The --destination_kms_key flag specifies the resource ID of the key to use with the destination or resulting table.

Optionally use the --destination_table flag to specify the destination for query results. If --destination_table is not used, the query results are written to a temporary table.

To query a table:

bq query \
--destination_table=DATASET_ID.TABLE_ID \
--destination_kms_key projects/KMS_PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY \
"SELECT name,count FROM DATASET_ID.TABLE_ID WHERE gender = 'M' ORDER BY count DESC LIMIT 6"

For more information about the bq command-line tool, see Using the bq command-line tool.

Go

import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/bigquery"
	"google.golang.org/api/iterator"
)

// queryWithDestinationCMEK demonstrates saving query results to a destination table and protecting those results
// by specifying a customer managed encryption key.
func queryWithDestinationCMEK(w io.Writer, projectID, dstDatasetID, dstTableID string) error {
	// projectID := "my-project-id"
	// datasetID := "mydataset"
	// tableID := "mytable"
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	q := client.Query("SELECT 17 as my_col")
	q.Location = "US" // Location must match the dataset(s) referenced in query.
	q.QueryConfig.Dst = client.Dataset(dstDatasetID).Table(dstTableID)
	q.DestinationEncryptionConfig = &bigquery.EncryptionConfig{
		// TODO: Replace this key with a key you have created in Cloud KMS.
		KMSKeyName: "projects/cloud-samples-tests/locations/us-central1/keyRings/test/cryptoKeys/test",
	}
	// Run the query and print results when the query job is completed.
	job, err := q.Run(ctx)
	if err != nil {
		return err
	}
	status, err := job.Wait(ctx)
	if err != nil {
		return err
	}
	if err := status.Err(); err != nil {
		return err
	}
	it, err := job.Read(ctx)
	if err != nil {
		return err
	}
	for {
		var row []bigquery.Value
		err := it.Next(&row)
		if err == iterator.Done {
			break
		}
		if err != nil {
			return err
		}
		fmt.Fprintln(w, row)
	}
	return nil
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

Protect query results with a customer-managed encryption key by setting a destination EncryptionConfiguration on the QueryJobConfiguration before running the query.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.EncryptionConfiguration;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

// Sample to query on destination table with encryption key
public class QueryDestinationTableCMEK {

  public static void runQueryDestinationTableCMEK() {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    String tableName = "MY_TABLE_NAME";
    String kmsKeyName = "MY_KMS_KEY_NAME";
    String query =
        String.format("SELECT stringField, booleanField FROM %s.%s", datasetName, tableName);
    EncryptionConfiguration encryption =
        EncryptionConfiguration.newBuilder().setKmsKeyName(kmsKeyName).build();
    queryDestinationTableCMEK(query, encryption);
  }

  public static void queryDestinationTableCMEK(String query, EncryptionConfiguration encryption) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      QueryJobConfiguration config =
          QueryJobConfiguration.newBuilder(query)
              // Set the encryption key to use for the destination.
              .setDestinationEncryptionConfiguration(encryption)
              .build();

      TableResult results = bigquery.query(config);

      results
          .iterateAll()
          .forEach(row -> row.forEach(val -> System.out.printf("%s,", val.toString())));
      System.out.println("Query performed successfully with encryption key.");
    } catch (BigQueryException | InterruptedException e) {
      System.out.println("Query not performed \n" + e.toString());
    }
  }
}

Python

from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

# TODO(developer): Set table_id to the ID of the destination table.
# table_id = "your-project.your_dataset.your_table_name"

# Set the encryption key to use for the destination.
# TODO(developer): Replace this key with a key you have created in KMS.
# kms_key_name = "projects/{}/locations/{}/keyRings/{}/cryptoKeys/{}".format(
#     your-project, location, your-ring, your-key
# )

job_config = bigquery.QueryJobConfig(
    destination=table_id,
    destination_encryption_configuration=bigquery.EncryptionConfiguration(
        kms_key_name=kms_key_name
    ),
)

# Start the query, passing in the extra configuration.
query_job = client.query(
    "SELECT 17 AS my_col;", job_config=job_config
)  # Make an API request.
query_job.result()  # Wait for the job to complete.

table = client.get_table(table_id)  # Make an API request.
if table.encryption_configuration.kms_key_name == kms_key_name:
    print("The destination table is written using the encryption configuration")

Load a table protected by Cloud KMS

To load a data file into a table that is protected by Cloud KMS:

Console

Protect a load job destination table with a customer-managed encryption key by specifying the key when you load the table.

  1. Open the BigQuery page in the Google Cloud console.

    Go to the BigQuery page

  2. In the Explorer panel, expand your project and select a dataset.

  3. In the details panel, click Create table.

  4. Enter the options you want to use for loading the table, but before you click Create table, click Advanced options.

  5. Under Encryption, select Customer-managed key.

  6. Click the Select a customer-managed key drop-down list and select the key to use. If you don't see any keys available, enter a key resource ID.

  7. Click Create table.

bq

Protect a load job destination table with a customer-managed encryption key by setting the --destination_kms_key flag.

bq --location=LOCATION load \
--autodetect \
--source_format=FORMAT \
--destination_kms_key projects/KMS_PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY \
DATASET.TABLE \
path_to_source

For example:
bq load \
--autodetect \
--source_format=NEWLINE_DELIMITED_JSON \
--destination_kms_key projects/KMS_PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY \
test2.table4 \
gs://cloud-samples-data/bigquery/us-states/us-states.json

Go

import (
	"context"
	"fmt"

	"cloud.google.com/go/bigquery"
)

// importJSONWithCMEK demonstrates loading newline-delimited JSON from Cloud Storage,
// and protecting the data with a customer-managed encryption key.
func importJSONWithCMEK(projectID, datasetID, tableID string) error {
	// projectID := "my-project-id"
	// datasetID := "mydataset"
	// tableID := "mytable"
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	gcsRef := bigquery.NewGCSReference("gs://cloud-samples-data/bigquery/us-states/us-states.json")
	gcsRef.SourceFormat = bigquery.JSON
	gcsRef.AutoDetect = true
	loader := client.Dataset(datasetID).Table(tableID).LoaderFrom(gcsRef)
	loader.WriteDisposition = bigquery.WriteEmpty
	loader.DestinationEncryptionConfig = &bigquery.EncryptionConfig{
		// TODO: Replace this key with a key you have created in KMS.
		KMSKeyName: "projects/cloud-samples-tests/locations/us-central1/keyRings/test/cryptoKeys/test",
	}

	job, err := loader.Run(ctx)
	if err != nil {
		return err
	}
	status, err := job.Wait(ctx)
	if err != nil {
		return err
	}

	if status.Err() != nil {
		return fmt.Errorf("job completed with error: %v", status.Err())
	}

	return nil
}

Java

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.EncryptionConfiguration;
import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.LoadJobConfiguration;
import com.google.cloud.bigquery.TableId;

// Sample to load JSON data with configuration key from Cloud Storage into a new BigQuery table
public class LoadJsonFromGCSCMEK {

  public static void runLoadJsonFromGCSCMEK() {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    String tableName = "MY_TABLE_NAME";
    String kmsKeyName = "MY_KMS_KEY_NAME";
    String sourceUri = "gs://cloud-samples-data/bigquery/us-states/us-states.json";
    // i.e. projects/{project}/locations/{location}/keyRings/{key_ring}/cryptoKeys/{cryptoKey}
    EncryptionConfiguration encryption =
        EncryptionConfiguration.newBuilder().setKmsKeyName(kmsKeyName).build();
    loadJsonFromGCSCMEK(datasetName, tableName, sourceUri, encryption);
  }

  public static void loadJsonFromGCSCMEK(
      String datasetName, String tableName, String sourceUri, EncryptionConfiguration encryption) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      TableId tableId = TableId.of(datasetName, tableName);
      LoadJobConfiguration loadConfig =
          LoadJobConfiguration.newBuilder(tableId, sourceUri)
              // Set the encryption key to use for the destination.
              .setDestinationEncryptionConfiguration(encryption)
              .setFormatOptions(FormatOptions.json())
              .setAutodetect(true)
              .build();

      // Load data from a GCS JSON file into the table
      Job job = bigquery.create(JobInfo.of(loadConfig));
      // Blocks until this load table job completes its execution, either failing or succeeding.
      job = job.waitFor();
      if (job.isDone()) {
        System.out.println("Table loaded succesfully from GCS with configuration key");
      } else {
        System.out.println(
            "BigQuery was unable to load into the table due to an error:"
                + job.getStatus().getError());
      }
    } catch (BigQueryException | InterruptedException e) {
      System.out.println("Column not added during load append \n" + e.toString());
    }
  }
}

Python

Protect a load job destination table with a customer-managed encryption key by setting the LoadJobConfig.destination_encryption_configuration property to an EncryptionConfiguration and loading the table.

from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

# TODO(developer): Set table_id to the ID of the table to create.
# table_id = "your-project.your_dataset.your_table_name

# Set the encryption key to use for the destination.
# TODO: Replace this key with a key you have created in KMS.
# kms_key_name = "projects/{}/locations/{}/keyRings/{}/cryptoKeys/{}".format(
#     "cloud-samples-tests", "us", "test", "test"
# )

job_config = bigquery.LoadJobConfig(
    autodetect=True,
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    destination_encryption_configuration=bigquery.EncryptionConfiguration(
        kms_key_name=kms_key_name
    ),
)

uri = "gs://cloud-samples-data/bigquery/us-states/us-states.json"

load_job = client.load_table_from_uri(
    uri,
    table_id,
    location="US",  # Must match the destination dataset location.
    job_config=job_config,
)  # Make an API request.

assert load_job.job_type == "load"

load_job.result()  # Waits for the job to complete.

assert load_job.state == "DONE"
table = client.get_table(table_id)

if table.encryption_configuration.kms_key_name == kms_key_name:
    print("A table loaded with encryption configuration key")

Stream into a table protected by Cloud KMS

You can stream data into your CMEK-protected BigQuery table without specifying any additional parameters. Note that this data is encrypted using your Cloud KMS key in the buffer as well as in the final location. Before using streaming with a CMEK table, review the requirements on key availability and accessibility.
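
As an illustration, the following minimal Python sketch (table name, field names, and rows are placeholders) streams rows into a CMEK-protected table with insert_rows_json and passes no encryption settings:

from google.cloud import bigquery

client = bigquery.Client()

# TODO(developer): Replace with an existing CMEK-protected table.
table_id = "your-project.your_dataset.your_cmek_table"

# No encryption parameters are passed. The streamed rows are protected by
# the table's Cloud KMS key in the streaming buffer and in final storage.
rows_to_insert = [
    {"name": "alpha", "value": 1},
    {"name": "beta", "value": 2},
]
errors = client.insert_rows_json(table_id, rows_to_insert)  # API request.
if errors:
    print(f"Encountered errors while inserting rows: {errors}")
else:
    print("Rows streamed successfully.")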

Learn more about streaming at Streaming data into BigQuery.

Change a table from default encryption to Cloud KMS protection

bq

You can use the bq cp command with the --destination_kms_key flag to copy a table protected by default encryption into a new table, or into the original table, protected by Cloud KMS. The --destination_kms_key flag specifies the resource ID of the key to use with the destination table.

To copy a table that has default encryption to a new table that has Cloud KMS protection:

bq cp \
--destination_kms_key projects/KMS_PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY \
SOURCE_DATASET_ID.SOURCE_TABLE_ID DESTINATION_DATASET_ID.DESTINATION_TABLE_ID

If you want to copy a table that has default encryption to the same table with Cloud KMS protection:

bq cp -f \
--destination_kms_key projects/KMS_PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY \
DATASET_ID.TABLE_ID DATASET_ID.TABLE_ID

If you want to change a table from Cloud KMS protection to default encryption, copy the table over itself by running bq cp without the --destination_kms_key flag.

For more information about the bq command-line tool, see Using the bq command-line tool.

Go

import (
	"context"
	"fmt"

	"cloud.google.com/go/bigquery"
)

// copyTableWithCMEK demonstrates creating a copy of a table and ensuring the copied data is
// protected with a customer managed encryption key.
func copyTableWithCMEK(projectID, datasetID, tableID string) error {
	// projectID := "my-project-id"
	// datasetID := "mydataset"
	// tableID := "mytable"
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	srcTable := client.DatasetInProject("bigquery-public-data", "samples").Table("shakespeare")
	copier := client.Dataset(datasetID).Table(tableID).CopierFrom(srcTable)
	copier.DestinationEncryptionConfig = &bigquery.EncryptionConfig{
		// TODO: Replace this key with a key you have created in Cloud KMS.
		KMSKeyName: "projects/cloud-samples-tests/locations/us-central1/keyRings/test/cryptoKeys/test",
	}
	job, err := copier.Run(ctx)
	if err != nil {
		return err
	}
	status, err := job.Wait(ctx)
	if err != nil {
		return err
	}
	if err := status.Err(); err != nil {
		return err
	}
	return nil
}

Java

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.CopyJobConfiguration;
import com.google.cloud.bigquery.EncryptionConfiguration;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.TableId;

// Sample to copy a cmek table
public class CopyTableCMEK {

  public static void runCopyTableCMEK() {
    // TODO(developer): Replace these variables before running the sample.
    String destinationDatasetName = "MY_DESTINATION_DATASET_NAME";
    String destinationTableId = "MY_DESTINATION_TABLE_NAME";
    String sourceDatasetName = "MY_SOURCE_DATASET_NAME";
    String sourceTableId = "MY_SOURCE_TABLE_NAME";
    String kmsKeyName = "MY_KMS_KEY_NAME";
    EncryptionConfiguration encryption =
        EncryptionConfiguration.newBuilder().setKmsKeyName(kmsKeyName).build();
    copyTableCMEK(
        sourceDatasetName, sourceTableId, destinationDatasetName, destinationTableId, encryption);
  }

  public static void copyTableCMEK(
      String sourceDatasetName,
      String sourceTableId,
      String destinationDatasetName,
      String destinationTableId,
      EncryptionConfiguration encryption) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      TableId sourceTable = TableId.of(sourceDatasetName, sourceTableId);
      TableId destinationTable = TableId.of(destinationDatasetName, destinationTableId);

      // For more information on CopyJobConfiguration see:
      // https://googleapis.dev/java/google-cloud-clients/latest/com/google/cloud/bigquery/JobConfiguration.html
      CopyJobConfiguration configuration =
          CopyJobConfiguration.newBuilder(destinationTable, sourceTable)
              .setDestinationEncryptionConfiguration(encryption)
              .build();

      // For more information on Job see:
      // https://googleapis.dev/java/google-cloud-clients/latest/index.html?com/google/cloud/bigquery/package-summary.html
      Job job = bigquery.create(JobInfo.of(configuration));

      // Blocks until this job completes its execution, either failing or succeeding.
      Job completedJob = job.waitFor();
      if (completedJob == null) {
        System.out.println("Job not executed since it no longer exists.");
        return;
      } else if (completedJob.getStatus().getError() != null) {
        System.out.println(
            "BigQuery was unable to copy table due to an error: \n" + job.getStatus().getError());
        return;
      }
      System.out.println("Table cmek copied successfully.");
    } catch (BigQueryException | InterruptedException e) {
      System.out.println("Table cmek copying job was interrupted. \n" + e.toString());
    }
  }
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

Protect the destination of a table copy with a customer-managed encryption key by setting the CopyJobConfig.destination_encryption_configuration property to an EncryptionConfiguration and copying the table.

from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

# TODO(developer): Set dest_table_id to the ID of the destination table.
# dest_table_id = "your-project.your_dataset.your_table_name"

# TODO(developer): Set orig_table_id to the ID of the original table.
# orig_table_id = "your-project.your_dataset.your_table_name"

# Set the encryption key to use for the destination.
# TODO(developer): Replace this key with a key you have created in KMS.
# kms_key_name = "projects/{}/locations/{}/keyRings/{}/cryptoKeys/{}".format(
#     your-project, location, your-ring, your-key
# )

job_config = bigquery.CopyJobConfig(
    destination_encryption_configuration=bigquery.EncryptionConfiguration(
        kms_key_name=kms_key_name
    )
)
job = client.copy_table(orig_table_id, dest_table_id, job_config=job_config)
job.result()  # Wait for the job to complete.

dest_table = client.get_table(dest_table_id)  # Make an API request.
if dest_table.encryption_configuration.kms_key_name == kms_key_name:
    print("A copy of the table created")

Determine if a table is protected by Cloud KMS

  1. In the Google Cloud console, click the blue arrow to the left of your dataset to expand it, or double-click the dataset name. This displays the tables and views in the dataset.

  2. Click the table name.

  3. Click Details. The Table Details page displays the table's description and table information.

  4. If the table is protected by Cloud KMS, the Customer-Managed Encryption Key field displays the key resource ID.

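You can also check programmatically. The following minimal Python sketch (the table name is a placeholder) reads the table metadata and reports the key resource ID if one is set:

from google.cloud import bigquery

client = bigquery.Client()

# TODO(developer): Replace with the table you want to inspect.
table_id = "your-project.your_dataset.your_table_name"

table = client.get_table(table_id)  # Make an API request.
if table.encryption_configuration:
    print(f"Protected by Cloud KMS key: {table.encryption_configuration.kms_key_name}")
else:
    print("The table uses default Google-managed encryption.")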

Change the Cloud KMS key for a BigQuery table

To change the Cloud KMS key of an existing CMEK-protected table, you can run an ALTER TABLE query, use the API, or use the bq command-line tool. There are two ways to modify the Cloud KMS key using the API and the bq command-line tool: update or cp.

If you use update, you can change the Cloud KMS key used for a CMEK-protected table.

If you use cp, you can change the Cloud KMS key used for a CMEK-protected table, change a table from default encryption to CMEK-protection, or change a table from CMEK-protection to default encryption.

An advantage of update is that it is faster than cp and lets you use table decorators.

SQL

Use the ALTER TABLE SET OPTIONS statement to update the kms_key_name field for a table:

  1. In the Google Cloud console, go to the BigQuery page.

    Go to BigQuery

  2. In the query editor, enter the following statement:

    ALTER TABLE DATASET_ID.mytable
    SET OPTIONS (
      kms_key_name
        = 'projects/KMS_PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY');
    

  3. Click Run.

For more information about how to run queries, see Run an interactive query.

bq

You can use the bq update command with the --destination_kms_key flag to change the key for a table protected by Cloud KMS. The --destination_kms_key flag specifies the resource ID of the key to use with the table.

bq update \
--destination_kms_key projects/KMS_PROJECT_ID/locations/LOCATION/keyRings/KEY_RING/cryptoKeys/KEY \
-t DATASET_ID.TABLE_ID

Go

import (
	"context"
	"fmt"

	"cloud.google.com/go/bigquery"
)

// updateTableChangeCMEK demonstrates how to change the customer managed encryption key that protects a table.
func updateTableChangeCMEK(projectID, datasetID, tableID string) error {
	// projectID := "my-project-id"
	// datasetID := "mydatasetid"
	// tableID := "mytableid"
	ctx := context.Background()

	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	tableRef := client.Dataset(datasetID).Table(tableID)
	meta, err := tableRef.Metadata(ctx)
	if err != nil {
		return err
	}
	update := bigquery.TableMetadataToUpdate{
		EncryptionConfig: &bigquery.EncryptionConfig{
			// TODO: Replace this key with a key you have created in Cloud KMS.
			KMSKeyName: "projects/cloud-samples-tests/locations/us-central1/keyRings/test/cryptoKeys/otherkey",
		},
	}
	if _, err := tableRef.Update(ctx, update, meta.ETag); err != nil {
		return err
	}
	return nil
}

Java

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.EncryptionConfiguration;
import com.google.cloud.bigquery.Table;
import com.google.cloud.bigquery.TableId;

// Sample to update a cmek table
public class UpdateTableCMEK {

  public static void runUpdateTableCMEK() {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    String tableName = "MY_TABLE_NAME";
    String kmsKeyName = "MY_KEY_NAME";
    // Set a new encryption key to use for the destination.
    // i.e. projects/{project}/locations/{location}/keyRings/{key_ring}/cryptoKeys/{cryptoKey}
    EncryptionConfiguration encryption =
        EncryptionConfiguration.newBuilder().setKmsKeyName(kmsKeyName).build();
    updateTableCMEK(datasetName, tableName, encryption);
  }

  public static void updateTableCMEK(
      String datasetName, String tableName, EncryptionConfiguration encryption) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      Table table = bigquery.getTable(TableId.of(datasetName, tableName));
      bigquery.update(table.toBuilder().setEncryptionConfiguration(encryption).build());
      System.out.println("Table cmek updated successfully");
    } catch (BigQueryException e) {
      System.out.println("Table cmek was not updated. \n" + e.toString());
    }
  }
}

Python

Change the customer-managed encryption key for a table by changing the Table.encryption_configuration property to a new EncryptionConfiguration object and updating the table.
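
A minimal sketch of that approach, assuming the table already exists and using placeholder table and key names:

from google.cloud import bigquery

client = bigquery.Client()

# TODO(developer): Set table_id to the ID of the CMEK-protected table.
table_id = "your-project.your_dataset.your_table_name"

# TODO(developer): Replace this with a different key you have created in Cloud KMS.
new_kms_key_name = "projects/your-project/locations/us/keyRings/test/cryptoKeys/otherkey"

table = client.get_table(table_id)  # Make an API request.
table.encryption_configuration = bigquery.EncryptionConfiguration(
    kms_key_name=new_kms_key_name
)
table = client.update_table(table, ["encryption_configuration"])  # API request.

print(f"Table now uses key: {table.encryption_configuration.kms_key_name}")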