Protecting Data with Cloud KMS Keys

By default, BigQuery encrypts customer content stored at rest. BigQuery handles and manages this default encryption for you without any additional actions on your part. First, data in a BigQuery table is encrypted using a data encryption key. Then, those data encryption keys are encrypted with key encryption keys, which is known as envelope encryption. Key encryption keys do not directly encrypt your data but are used to encrypt the data encryption keys that Google uses to encrypt your data. For more information, see Key management.
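
The envelope pattern can be sketched in a few lines of Python. This is a toy illustration only; the XOR keystream below stands in for the AES-256 encryption that BigQuery actually uses, and the point is the wrap/unwrap structure, not the cipher:

```python
import hashlib
import secrets

def _toy_cipher(key: bytes, data: bytes) -> bytes:
    """XOR `data` with a SHA-256 keystream derived from `key`.
    A toy stand-in for AES-256; symmetric, so the same call
    both encrypts and decrypts."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

# Envelope encryption: data is encrypted with a data encryption key (DEK);
# the DEK itself is encrypted ("wrapped") with a key encryption key (KEK).
kek = secrets.token_bytes(32)  # with CMEK, this key lives in Cloud KMS
dek = secrets.token_bytes(32)  # per-table data encryption key
plaintext = b"row data"

ciphertext = _toy_cipher(dek, plaintext)  # data encrypted with the DEK
wrapped_dek = _toy_cipher(kek, dek)       # DEK encrypted with the KEK

# To decrypt: unwrap the DEK with the KEK, then decrypt the data.
recovered = _toy_cipher(_toy_cipher(kek, wrapped_dek), ciphertext)
assert recovered == plaintext
```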

If you want to control encryption yourself, you can use customer-managed encryption keys (CMEK) for BigQuery. Instead of Google managing the key encryption keys that protect your data, you control and manage key encryption keys in Cloud KMS. This topic provides details about this technique.

Learn more about encryption options on Google Cloud Platform.

Before you begin

  1. Understand datasets, tables, and queries.

  2. Decide whether you are going to run BigQuery and Cloud KMS in the same GCP project, or in different projects. For documentation example purposes, the following convention is used:

    • [PROJECT_ID] is the project ID of the project running BigQuery
    • [PROJECT_NUMBER] is the project number of the project running BigQuery
    • [KMS_PROJECT_ID] is the project ID of the project running Cloud KMS (even if this is the same project running BigQuery)
    For information about GCP project IDs and project numbers, see Identifying projects.

  3. BigQuery is automatically enabled in new projects. If you are using a pre-existing project to run BigQuery, enable the BigQuery API.

  4. For the GCP project that runs Cloud KMS:

    1. Enable the Cloud KMS API.
    2. Create a key ring and a key as described in Creating Key Rings and Keys. Create the key ring in a location that matches the location of your BigQuery dataset:
      • Any multi-regional dataset should use a multi-regional key ring from a matching location. For example, a dataset in region US should be protected with a key ring from region us, and a dataset in region EU should be protected with a key ring from region europe.
      • Regional datasets should use a matching regional key ring. For example, a dataset in region asia-northeast1 should be protected with a key ring from region asia-northeast1.
      • The global region is not supported for use with BigQuery.
      For more information about the supported locations for BigQuery and Cloud KMS, see Cloud Locations.
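
As an illustration only, the matching rules above can be encoded in a few lines of Python. The function name and the US→us / EU→europe mapping below come from this section's examples, not from any SDK:

```python
# Illustrative helper (not part of any Google SDK) encoding the
# location-matching rules for BigQuery datasets and Cloud KMS key rings.
_MULTI_REGION_KEY_LOCATION = {"US": "us", "EU": "europe"}

def compatible_key_ring_location(dataset_location: str,
                                 key_ring_location: str) -> bool:
    """Return True if a key ring in key_ring_location can protect a
    dataset in dataset_location."""
    if key_ring_location == "global":
        return False  # the global region is not supported for BigQuery
    multi = _MULTI_REGION_KEY_LOCATION.get(dataset_location.upper())
    if multi is not None:
        return key_ring_location == multi
    # Regional datasets need a key ring in the same region.
    return key_ring_location == dataset_location

assert compatible_key_ring_location("US", "us")
assert compatible_key_ring_location("asia-northeast1", "asia-northeast1")
assert not compatible_key_ring_location("US", "global")
```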

Encryption specification

Cloud KMS keys used to protect your data in BigQuery are AES-256 keys. These keys are used as key encryption keys in BigQuery, in that they encrypt the data encryption keys that encrypt your data.

Grant encryption and decryption permission

Use the Google Cloud Platform Console to determine the BigQuery service account ID, and provide the service account with the appropriate role to encrypt and decrypt using Cloud KMS.

Determine the service account ID

command line

You can use the bq show command with the --encryption_service_account flag to determine the service account ID:

bq show --encryption_service_account

The command displays the service account ID:

                       ServiceAccountID
     -------------------------------------------------------------
      bq-[PROJECT_NUMBER]@bigquery-encryption.iam.gserviceaccount.com
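
The same lookup is available from the client libraries. As a sketch, with the google-cloud-bigquery Python library, Client.get_service_account_email returns this account:

```python
# Sketch: look up the BigQuery encryption service account with the
# google-cloud-bigquery client library, equivalent to
# `bq show --encryption_service_account`.
def bigquery_encryption_service_account(client):
    """`client` is assumed to be an authenticated
    google.cloud.bigquery.Client for the project running BigQuery."""
    return client.get_service_account_email()

# Usage (requires credentials):
# from google.cloud import bigquery
# client = bigquery.Client()
# print(bigquery_encryption_service_account(client))
```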

Web UI

  1. Go to the BigQuery web UI.

  2. Click the down arrow icon next to your project name in the navigation and then click Customer Managed Encryption.

  3. A dialog opens and shows the service account that requires encryption and decryption permission.

  4. Click Copy to copy the service account ID to your clipboard, and then click OK to close the dialog.

Assign the Encrypter/Decrypter role

Assign the Cloud KMS CryptoKey Encrypter/Decrypter role to the BigQuery system service account that you copied to your clipboard. This account is of the form

bq-[PROJECT_NUMBER]@bigquery-encryption.iam.gserviceaccount.com

command line

You can use the gcloud command-line tool to assign the role:

gcloud kms keys add-iam-policy-binding \
--project=[KMS_PROJECT_ID] \
--member serviceAccount:bq-[PROJECT_NUMBER]@bigquery-encryption.iam.gserviceaccount.com \
--role roles/cloudkms.cryptoKeyEncrypterDecrypter \
--location=[KMS_KEY_LOCATION] \
--keyring=[KMS_KEY_RING] \
[KMS_KEY]

Replace [KMS_PROJECT_ID] with the ID of the GCP project that runs Cloud KMS, replace [PROJECT_NUMBER] with the project number (not the project ID) of the GCP project that runs BigQuery, and replace [KMS_KEY_LOCATION], [KMS_KEY_RING], and [KMS_KEY] with the location, key ring, and key names of your Cloud KMS key.

Web UI

  1. Open the Cryptographic Keys page in the GCP Console.

  2. Select your project and click Continue.

  3. Identify the encryption key to which you want to add the role.

    • If the bq-[PROJECT_NUMBER]@bigquery-encryption.iam.gserviceaccount.com service account isn't already on the members list, it doesn't have any roles assigned to it. Click Add member and enter the email address of the service account, bq-[PROJECT_NUMBER]@bigquery-encryption.iam.gserviceaccount.com.
    • If the service account is already on the members list, it has existing roles. Click the current role drop-down list for the bq-[PROJECT_NUMBER]@bigquery-encryption.iam.gserviceaccount.com service account.
  4. Click the drop-down list for Role, click Cloud KMS, and then click the Cloud KMS CryptoKey Encrypter/Decrypter role.

  5. Click Add or Save to apply the role to the bq-[PROJECT_NUMBER]@bigquery-encryption.iam.gserviceaccount.com service account.

Create a table protected by Cloud KMS

Create an empty table protected by Cloud KMS

To create a table that is protected by Cloud KMS:

command line

You can use the bq command-line tool with the --destination_kms_key flag to create the table. The --destination_kms_key flag specifies the resource ID of the key to use with the table. This key is in the form:

projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY]

For information on how to retrieve the key resource ID, see Key resource ID.

To create an empty table with a schema:

bq mk --schema name:string,value:integer -t \
--destination_kms_key projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY] \
mydataset.newtable

Alternatively, you can use a DDL statement:

bq query --use_legacy_sql=false "
  CREATE TABLE mydataset.newtable (name STRING, value INT64)
  OPTIONS(
    kms_key_name='projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY]'
  )
"

To create a table from a query:

bq query --destination_table=mydataset.newtable \
--destination_kms_key projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY] \
"SELECT name,count FROM mydataset.babynames WHERE gender = 'M' ORDER BY count DESC LIMIT 6"

For more information about the bq command-line tool, see bq Command-line Tool.

Web UI

  1. Click the down arrow icon next to your dataset name in the BigQuery web user interface and then click Create new table.

  2. On the Create table page, fill in the information needed to either create an empty table with no schema or create an empty table with a schema definition. Before you click Create Table, set the encryption type and specify the Cloud KMS key to use with the table:

    1. Click the drop-down list for Encryption Type and select Customer-Managed Encryption.
    2. For Customer-Managed Encryption Key, enter the resource ID for the key. This key is in the form:
       projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY]
       
      For information on how to retrieve the key resource ID, see Key resource ID.
  3. Click Create Table.

Go

// To run this sample, you will need to create (or reuse) a context and
// an instance of the bigquery client.  For example:
// import "cloud.google.com/go/bigquery"
// ctx := context.Background()
// client, err := bigquery.NewClient(ctx, "your-project-id")
tableRef := client.Dataset(datasetID).Table(tableID)
meta := &bigquery.TableMetadata{
	EncryptionConfig: &bigquery.EncryptionConfig{
		// TODO: Replace this key with a key you have created in Cloud KMS.
		KMSKeyName: "projects/cloud-samples-tests/locations/us-central1/keyRings/test/cryptoKeys/test",
	},
}
if err := tableRef.Create(ctx, meta); err != nil {
	return err
}

Python

Protect a new table with a customer-managed encryption key by setting the Table.encryption_configuration property to an EncryptionConfiguration object before creating the table.

# from google.cloud import bigquery
# client = bigquery.Client()
# dataset_id = 'my_dataset'

table_ref = client.dataset(dataset_id).table('my_table')
table = bigquery.Table(table_ref)

# Set the encryption key to use for the table.
# TODO: Replace this key with a key you have created in Cloud KMS.
kms_key_name = 'projects/{}/locations/{}/keyRings/{}/cryptoKeys/{}'.format(
    'cloud-samples-tests', 'us-central1', 'test', 'test')
table.encryption_configuration = bigquery.EncryptionConfiguration(
    kms_key_name=kms_key_name)

table = client.create_table(table)  # API request

assert table.encryption_configuration.kms_key_name == kms_key_name

Query a table protected by a Cloud KMS key

No special arrangements are required to query a table protected by Cloud KMS. BigQuery stores the name of the key used to encrypt the table content and will use that key when a table protected by Cloud KMS is queried.

All existing tools, the BigQuery console, and the bq command-line interface run the same way as with default-encrypted tables, as long as BigQuery has access to the Cloud KMS key used to encrypt the table content.

Protect query results with a Cloud KMS key

command line

Specify the flag --destination_kms_key to protect the destination table or query results (if using a temporary table) with your Cloud KMS key. The --destination_kms_key flag specifies the resource ID of the key to use with the destination or resulting table. This key is in the form:

projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY]

For information on how to retrieve the key resource ID, see Key resource ID.

Optionally use the --destination_table flag to specify the destination for query results. If --destination_table is not used, the query results will be written to a temporary table.

To query a table:

bq query \
--destination_table=mydataset.newtable \
--destination_kms_key projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY] \
"SELECT name,count FROM mydataset.babynames WHERE gender = 'M' ORDER BY count DESC LIMIT 6"

For more information about the bq command-line tool, see bq Command-line Tool.

Web UI

  1. Click the Compose query button in the BigQuery web user interface.

  2. Enter a valid BigQuery SQL query in the New Query text area.

  3. Click Encryption Type and select Customer-Managed Encryption.

  4. For Customer-Managed Encryption Key, enter the resource ID for the key. This key is in the form:

    projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY]
    
    For information on how to retrieve the key resource ID, see Key resource ID.

  5. Click Run Query.

Go

q := client.Query("SELECT 17 as my_col")
q.Location = "US" // Location must match the dataset(s) referenced in query.
q.QueryConfig.Dst = client.Dataset(destDatasetID).Table(destTableID)
q.DestinationEncryptionConfig = &bigquery.EncryptionConfig{
	// TODO: Replace this key with a key you have created in Cloud KMS.
	KMSKeyName: "projects/cloud-samples-tests/locations/us-central1/keyRings/test/cryptoKeys/test",
}
job, err := q.Run(ctx)
if err != nil {
	return err
}
status, err := job.Wait(ctx)
if err != nil {
	return err
}
if err := status.Err(); err != nil {
	return err
}
it, err := job.Read(ctx)
if err != nil {
	return err
}
for {
	var row []bigquery.Value
	err := it.Next(&row)
	if err == iterator.Done {
		break
	}
	if err != nil {
		return err
	}
	fmt.Println(row)
}

Python

Protect a query destination table with a customer-managed encryption key by setting the QueryJobConfig.destination_encryption_configuration property to an EncryptionConfiguration and running the query.

# from google.cloud import bigquery
# client = bigquery.Client()

job_config = bigquery.QueryJobConfig()

# Set the destination table. Here, dataset_id is a string, such as:
# dataset_id = 'your_dataset_id'
table_ref = client.dataset(dataset_id).table('your_table_id')
job_config.destination = table_ref

# Set the encryption key to use for the destination.
# TODO: Replace this key with a key you have created in KMS.
kms_key_name = 'projects/{}/locations/{}/keyRings/{}/cryptoKeys/{}'.format(
    'cloud-samples-tests', 'us-central1', 'test', 'test')
encryption_config = bigquery.EncryptionConfiguration(
    kms_key_name=kms_key_name)
job_config.destination_encryption_configuration = encryption_config

# Start the query, passing in the extra configuration.
query_job = client.query(
    'SELECT 17 AS my_col;',
    # Location must match that of the dataset(s) referenced in the query
    # and of the destination table.
    location='US',
    job_config=job_config)  # API request - starts the query
query_job.result()

# The destination table is written using the encryption configuration.
table = client.get_table(table_ref)
assert table.encryption_configuration.kms_key_name == kms_key_name

Load a table protected by Cloud KMS

To load a data file into a table that is protected by Cloud KMS:

Go

// To run this sample, you will need to create (or reuse) a context and
// an instance of the bigquery client.  For example:
// import "cloud.google.com/go/bigquery"
// ctx := context.Background()
// client, err := bigquery.NewClient(ctx, "your-project-id")
gcsRef := bigquery.NewGCSReference("gs://cloud-samples-data/bigquery/us-states/us-states.json")
gcsRef.SourceFormat = bigquery.JSON
gcsRef.AutoDetect = true
loader := client.Dataset(datasetID).Table(tableID).LoaderFrom(gcsRef)
loader.WriteDisposition = bigquery.WriteEmpty
loader.DestinationEncryptionConfig = &bigquery.EncryptionConfig{
	// TODO: Replace this key with a key you have created in KMS.
	KMSKeyName: "projects/cloud-samples-tests/locations/us-central1/keyRings/test/cryptoKeys/test",
}

job, err := loader.Run(ctx)
if err != nil {
	return err
}
status, err := job.Wait(ctx)
if err != nil {
	return err
}

if status.Err() != nil {
	return fmt.Errorf("Job completed with error: %v", status.Err())
}

Python

Protect a load job destination table with a customer-managed encryption key by setting the LoadJobConfig.destination_encryption_configuration property to an EncryptionConfiguration and running the load job.

# from google.cloud import bigquery
# client = bigquery.Client()
# dataset_id = 'my_dataset'

dataset_ref = client.dataset(dataset_id)
job_config = bigquery.LoadJobConfig()
job_config.autodetect = True
job_config.source_format = bigquery.SourceFormat.NEWLINE_DELIMITED_JSON

# Set the encryption key to use for the destination.
# TODO: Replace this key with a key you have created in KMS.
kms_key_name = 'projects/{}/locations/{}/keyRings/{}/cryptoKeys/{}'.format(
    'cloud-samples-tests', 'us-central1', 'test', 'test')
encryption_config = bigquery.EncryptionConfiguration(
    kms_key_name=kms_key_name)
job_config.destination_encryption_configuration = encryption_config
uri = 'gs://cloud-samples-data/bigquery/us-states/us-states.json'

load_job = client.load_table_from_uri(
    uri,
    dataset_ref.table('us_states'),
    location='US',  # Location must match that of the destination dataset.
    job_config=job_config)  # API request

assert load_job.job_type == 'load'

load_job.result()  # Waits for table load to complete.

assert load_job.state == 'DONE'
table = client.get_table(dataset_ref.table('us_states'))
assert table.encryption_configuration.kms_key_name == kms_key_name

Stream into a table protected by Cloud KMS

You can stream data into your CMEK-protected BigQuery table without specifying any additional parameters. Note that this data is encrypted using your Cloud KMS key in the buffer as well as in the final location. Before using streaming with a CMEK table, review the requirements on key availability and accessibility.

Learn more about streaming at Streaming Data into BigQuery.
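
As a sketch of how little changes on the client side, a streaming insert into a CMEK-protected table with the google-cloud-bigquery Python library looks the same as any other streaming insert (the table and row values below are placeholders):

```python
# Sketch: stream rows into a CMEK-protected table. No encryption
# parameters are needed; BigQuery applies the table's Cloud KMS key
# in the streaming buffer and in the final storage.
def stream_rows(client, table_id, rows):
    """`client` is assumed to be a google.cloud.bigquery.Client;
    `rows` is a list of JSON-compatible dicts."""
    errors = client.insert_rows_json(table_id, rows)  # API request
    if errors:
        # insert_rows_json returns per-row errors; empty means success.
        raise RuntimeError("streaming insert failed: %s" % errors)

# Usage (requires credentials; names are placeholders):
# from google.cloud import bigquery
# client = bigquery.Client()
# stream_rows(client, 'mydataset.newtable', [{'name': 'alpha', 'value': 1}])
```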

Change a table from default encryption to Cloud KMS protection

command line

You can use the bq cp command with the --destination_kms_key flag to copy a table protected by default encryption into a new table, or into the original table, protected by Cloud KMS. The --destination_kms_key flag specifies the resource ID of the key to use with the destination table. This key is in the form:

projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY]

For information on how to retrieve the key resource ID, see Key resource ID.

To copy a table that has default encryption to a new table that has Cloud KMS protection:

bq cp \
--destination_kms_key projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY] \
sourceDataset.sourceTableId destinationDataset.destinationTableId

If you want to copy a table that has default encryption to the same table with Cloud KMS protection:

bq cp -f \
--destination_kms_key projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY] \
sourceDataset.sourceTableId sourceDataset.sourceTableId

If you want to change a table from Cloud KMS protection back to default encryption, copy the table to itself by running bq cp without the --destination_kms_key flag.

For more information about the bq command-line tool, see bq Command-line Tool.

Go

// To run this sample, you will need to create (or reuse) a context and
// an instance of the bigquery client.  For example:
// import "cloud.google.com/go/bigquery"
// ctx := context.Background()
// client, err := bigquery.NewClient(ctx, "your-project-id")
srcTable := client.DatasetInProject("bigquery-public-data", "samples").Table("shakespeare")
copier := client.Dataset(datasetID).Table(tableID).CopierFrom(srcTable)
copier.DestinationEncryptionConfig = &bigquery.EncryptionConfig{
	// TODO: Replace this key with a key you have created in Cloud KMS.
	KMSKeyName: "projects/cloud-samples-tests/locations/us-central1/keyRings/test/cryptoKeys/test",
}
job, err := copier.Run(ctx)
if err != nil {
	return err
}
status, err := job.Wait(ctx)
if err != nil {
	return err
}
if err := status.Err(); err != nil {
	return err
}

Python

Protect the destination of a table copy with a customer-managed encryption key by setting the CopyJobConfig.destination_encryption_configuration property to an EncryptionConfiguration and running the copy job.

# from google.cloud import bigquery
# client = bigquery.Client()

source_dataset = bigquery.DatasetReference(
    'bigquery-public-data', 'samples')
source_table_ref = source_dataset.table('shakespeare')

# dataset_id = 'my_dataset'
dest_dataset_ref = client.dataset(dataset_id)
dest_table_ref = dest_dataset_ref.table('destination_table')

# Set the encryption key to use for the destination.
# TODO: Replace this key with a key you have created in KMS.
kms_key_name = 'projects/{}/locations/{}/keyRings/{}/cryptoKeys/{}'.format(
    'cloud-samples-tests', 'us-central1', 'test', 'test')
encryption_config = bigquery.EncryptionConfiguration(
    kms_key_name=kms_key_name)
job_config = bigquery.CopyJobConfig()
job_config.destination_encryption_configuration = encryption_config

job = client.copy_table(
    source_table_ref,
    dest_table_ref,
    # Location must match that of the source and destination tables.
    location='US',
    job_config=job_config)  # API request
job.result()  # Waits for job to complete.

assert job.state == 'DONE'
dest_table = client.get_table(dest_table_ref)
assert dest_table.encryption_configuration.kms_key_name == kms_key_name

Determine if a table is protected by Cloud KMS

  1. In the BigQuery web user interface, click the blue arrow to the left of your dataset to expand it, or double-click the dataset name. This displays the tables and views in the dataset.

  2. Click the table name.

  3. Click Details. The Table Details page displays the table's description and table information.

  4. If the table is protected by Cloud KMS, the Customer-Managed Encryption Key field will display the key resource ID.

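
You can also check programmatically. With the google-cloud-bigquery Python library, a table's encryption_configuration property is None for tables that use default encryption; otherwise it carries the key resource ID. A sketch (client and table names are placeholders):

```python
# Sketch: report whether a table is CMEK-protected.
def table_kms_key(client, table_ref):
    """Return the table's Cloud KMS key resource ID, or None if the
    table uses default encryption. `client` is assumed to be a
    google.cloud.bigquery.Client."""
    table = client.get_table(table_ref)  # API request
    config = table.encryption_configuration
    return config.kms_key_name if config is not None else None

# Usage (requires credentials):
# from google.cloud import bigquery
# client = bigquery.Client()
# print(table_kms_key(client, 'mydataset.newtable'))
```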
Change the Cloud KMS key for a BigQuery table

To change the Cloud KMS key of an existing CMEK-protected table, you can use the API or the bq command-line tool. There are two ways to modify the key: update and cp. With update, you can change the Cloud KMS key used for a CMEK-protected table. With cp, you can change the Cloud KMS key used for a CMEK-protected table, change a table from default encryption to CMEK protection, or change a table from CMEK protection to default encryption. The advantages of update are that it is faster than cp and that it allows the use of table decorators.

command line

You can use the bq update command with the --destination_kms_key flag to change the key for a table protected by Cloud KMS. The --destination_kms_key flag specifies the resource ID of the key to use with the table. This key is in the form:

projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY]

For information on how to retrieve the key resource ID, see Key resource ID.

bq update \
--destination_kms_key projects/[PROJECT_ID]/locations/[LOCATION]/keyRings/[KEYRING]/cryptoKeys/[KEY] \
-t [DATASET_ID].[TABLE_ID]

Go

// To run this sample, you will need to create (or reuse) a context and
// an instance of the bigquery client.  For example:
// import "cloud.google.com/go/bigquery"
// ctx := context.Background()
// client, err := bigquery.NewClient(ctx, "your-project-id")
tableRef := client.Dataset(datasetID).Table(tableID)
meta, err := tableRef.Metadata(ctx)
if err != nil {
	return err
}
update := bigquery.TableMetadataToUpdate{
	EncryptionConfig: &bigquery.EncryptionConfig{
		// TODO: Replace this key with a key you have created in Cloud KMS.
		KMSKeyName: "projects/cloud-samples-tests/locations/us-central1/keyRings/test/cryptoKeys/otherkey",
	},
}
if _, err := tableRef.Update(ctx, update, meta.ETag); err != nil {
	return err
}

Python

Change the customer-managed encryption key for a table by setting the Table.encryption_configuration property to a new EncryptionConfiguration object and updating the table.

# from google.cloud import bigquery
# client = bigquery.Client()

assert table.encryption_configuration.kms_key_name == original_kms_key_name

# Set a new encryption key to use for the destination.
# TODO: Replace this key with a key you have created in KMS.
updated_kms_key_name = (
    'projects/cloud-samples-tests/locations/us-central1/'
    'keyRings/test/cryptoKeys/otherkey')
table.encryption_configuration = bigquery.EncryptionConfiguration(
    kms_key_name=updated_kms_key_name)

table = client.update_table(
    table, ['encryption_configuration'])  # API request

assert table.encryption_configuration.kms_key_name == updated_kms_key_name
assert original_kms_key_name != updated_kms_key_name

Remove BigQuery's access to the Cloud KMS key

You can remove BigQuery's access to the Cloud KMS key at any time by revoking the Cloud IAM permission for that key.

If BigQuery loses access to the Cloud KMS key, the user experience can suffer significantly and data loss may occur:

  • Data in these CMEK-protected tables can no longer be accessed — query, cp, extract, and tabledata.list will all fail.

  • No new data can be added to these CMEK-protected tables.

  • Even after access is restored, query performance on these tables can be degraded for multiple days.

Limitations

BigQuery access to the Cloud KMS key

A Cloud KMS key is considered available and accessible by BigQuery if:

  • the key is enabled
  • the BigQuery service account has encrypt and decrypt permissions on the key

The following sections describe impact to streaming inserts and long-term inaccessible data when a key is inaccessible.

Impact to streaming inserts

The Cloud KMS key must be available and accessible for at least 24 consecutive hours in the 48-hour period following a streaming insertion request. If the key is not available and accessible, the streamed data may not be fully persisted and can be lost. For more information on streaming inserts, see Streaming Data into BigQuery.

Impact to long-term inaccessible data

As BigQuery provides managed storage, long-term inaccessible data is not compatible with BigQuery's architecture. If the Cloud KMS key of a given BigQuery table is not available and accessible for 60 consecutive days, BigQuery may choose to delete the table and its associated data. BigQuery sends an email to the address associated with the billing account at least 7 days before the deletion.

Using table decorators

When data in a table protected by Cloud KMS is replaced via write disposition WRITE_TRUNCATE (for load, cp, or query operations), whether the table can still be queried via a table decorator depends on the snapshot decorator time.

Assuming the table was replaced at time T and the snapshot decorator's snapshot_time is earlier than T, the following table shows whether you can query at snapshot_time:

Encryption type before T    Encryption type after T    Query at snapshot_time
Cloud KMS encrypted         Cloud KMS encrypted        Cannot query
Default encrypted           Cloud KMS encrypted        Can query
Cloud KMS encrypted         Default encrypted          Cannot query

Note that similar logic applies to <time2> when a range decorator is used.
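
The decorator rules above can be encoded as a small lookup. This is purely illustrative; the "kms" and "default" labels are our own:

```python
# Illustrative lookup for the decorator rules above: given the table's
# encryption type before and after the WRITE_TRUNCATE at time T,
# can a snapshot decorator with snapshot_time < T still be queried?
_CAN_QUERY_SNAPSHOT = {
    ("kms", "kms"): False,
    ("default", "kms"): True,
    ("kms", "default"): False,
}

def can_query_snapshot(before: str, after: str) -> bool:
    return _CAN_QUERY_SNAPSHOT[(before, after)]

assert can_query_snapshot("default", "kms")
assert not can_query_snapshot("kms", "kms")
```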

Frequently asked questions

Who needs permission to the Cloud KMS key?

With customer-managed encryption keys, specifying permissions repeatedly is not required. As long as the BigQuery service account has permission to use the Cloud KMS key to encrypt and decrypt, anyone with permission to the BigQuery table can access the data — even if they don't have direct access to the Cloud KMS key.

Which service account is used?

The BigQuery service account associated with the GCP project of the table is used to decrypt that table's data. The BigQuery service accounts are unique for each project. For a job that writes data into a Cloud KMS-protected anonymous table, the job's project's service account is used.

As an example, consider three CMEK-protected tables: project1.table1, project2.table2, and project3.table3. To query data from project1.table1 and project2.table2 into destination table project3.table3, BigQuery uses:

  • The project1 service account for project1.table1
  • The project2 service account for project2.table2
  • The project3 service account for project3.table3

In what ways can BigQuery use my Cloud KMS key?

BigQuery uses the Cloud KMS key to decrypt data in response to user requests, for example, tabledata.list or jobs.insert.

BigQuery can also use the key for data maintenance and storage optimization tasks, like data conversion into a read-optimized format.

How do I get more help?

If you have questions that are not answered here, reach out to cmek-feedback@google.com, see BigQuery support, or see Cloud KMS support.

Troubleshooting errors

The following describes common errors and recommended mitigations.

Error: Please grant Cloud KMS CryptoKey Encrypter/Decrypter role
Recommendation: The BigQuery service account associated with your project does not have sufficient Cloud IAM permission to operate on the specified Cloud KMS key. Follow the instructions in the error message or in this documentation to grant the proper Cloud IAM permission.

Error: Existing table encryption settings do not match encryption settings specified in the request
Recommendation: The destination table has encryption settings that do not match the encryption settings in your request. As a mitigation, use write disposition TRUNCATE to replace the table, or specify a different destination table.

Error: This region is not supported
Recommendation: The region of the Cloud KMS key does not match the region of the BigQuery dataset of the destination table. As a mitigation, select a key in a region that matches your dataset, or load into a dataset that matches the key region.

Known issues

  • BigQuery client libraries do not yet support commands for configuring customer-managed encryption keys.