Bigtable audit logging information

This document describes the audit logs created by Bigtable as part of Cloud Audit Logs.


Google Cloud services write audit logs to help you answer the questions, "Who did what, where, and when?" within your Google Cloud resources.

Your Google Cloud projects contain only the audit logs for resources that are directly within the Google Cloud project. Other Google Cloud resources, such as folders, organizations, and billing accounts, contain the audit logs for the entity itself.

For a general overview of Cloud Audit Logs, see Cloud Audit Logs overview. For a deeper understanding of the audit log format, see Understand audit logs.

Available audit logs

The following types of audit logs are available for Bigtable:

  • Admin Activity audit logs

    Includes "admin write" operations that write metadata or configuration information.

    You can't disable Admin Activity audit logs.

  • Data Access audit logs

    Includes "admin read" operations that read metadata or configuration information. Also includes "data read" and "data write" operations that read or write user-provided data.

    To receive Data Access audit logs, you must explicitly enable them.

  • System Event audit logs

    Identifies automated Google Cloud actions that modify the configuration of resources.

    You can't disable System Event audit logs.

For fuller descriptions of the audit log types, see Types of audit logs.

Audited operations

The following tables summarize which API operations correspond to each audit log type in Bigtable:

Admin Activity audit logging

Resource type  Bigtable operation     Log type
AppProfile     CreateAppProfile       ADMIN_WRITE
AppProfile     UpdateAppProfile       ADMIN_WRITE
AppProfile     EnableAppProfile       ADMIN_WRITE
AppProfile     DisableAppProfile      ADMIN_WRITE
AppProfile     DeleteAppProfile       ADMIN_WRITE
Backup         CreateBackup           ADMIN_WRITE
Backup         UpdateBackup           ADMIN_WRITE
Backup         DeleteBackup           ADMIN_WRITE
Backup         RestoreTable           ADMIN_WRITE
Cluster        CreateCluster          ADMIN_WRITE
Cluster        UpdateCluster          ADMIN_WRITE
Cluster        PartialUpdateCluster   ADMIN_WRITE
Cluster        DeleteCluster          ADMIN_WRITE
Instance       CreateInstance         ADMIN_WRITE
Instance       UpdateInstance         ADMIN_WRITE
Instance       PartialUpdateInstance  ADMIN_WRITE
Instance       DeleteInstance         ADMIN_WRITE
Table          CreateTable            ADMIN_WRITE
Table          ModifyColumnFamilies   ADMIN_WRITE
Table          CheckAndMutateRow      ADMIN_WRITE

Data Access audit logging

Resource type  Bigtable operation        Log type
AppProfile     GetAppProfile             ADMIN_READ
AppProfile     ListAppProfiles           ADMIN_READ
Backup         CreateBackup              DATA_READ
Backup         ListBackups               ADMIN_READ
Cluster        GetCluster                ADMIN_READ
Cluster        ListClusters              ADMIN_READ
Instance       GetInstance               ADMIN_READ
Instance       ListInstances             ADMIN_READ
Table          ListTables                ADMIN_READ
Table          GenerateConsistencyToken  ADMIN_READ
Table          CheckConsistency          ADMIN_READ
Table          SampleRowKeys             DATA_READ
Table          CheckAndMutateRow         DATA_READ
Table          ReadModifyWriteRow        DATA_WRITE or DATA_READ

If you previously enabled Data Access audit logs for all Google Cloud services in the Cloud Audit Logs default configuration, you might need to take additional steps to enable Data Access audit logging for Bigtable. Affected customers see a notification at the top of the Google Cloud console Bigtable page.

For details about which fields are logged for Data Access audit logging, see Audit log fields.

System Event audit logging

Resource type     Bigtable operation
Audited resource  AutoscaleCluster

Audit log format

Audit log entries include the following objects:

  • The log entry itself, which is an object of type LogEntry. Useful fields include the following:

    • The logName contains the resource ID and audit log type.
    • The resource contains the target of the audited operation.
    • The timestamp contains the time of the audited operation.
    • The protoPayload contains the audited information.
  • The audit logging data, which is an AuditLog object held in the protoPayload field of the log entry.

  • Optional service-specific audit information, which is a service-specific object. For earlier integrations, this object is held in the serviceData field of the AuditLog object; later integrations use the metadata field.

For other fields in these objects, and how to interpret them, review Understand audit logs.
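
To make these fields concrete, the following Python sketch pulls the "who, what, where, when" out of a single entry. The sample entry is abridged and its values are invented for illustration, not a verbatim Bigtable log, though the field names follow the LogEntry and AuditLog structures described above.

```python
import json

# Abridged, illustrative audit log entry; values are invented for this example.
entry = {
    "logName": "projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access",
    "resource": {"type": "bigtable_table", "labels": {"project_id": "my-project"}},
    "timestamp": "2024-05-01T12:00:00Z",
    "protoPayload": {
        "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
        "methodName": "google.bigtable.v2.Bigtable.SampleRowKeys",
        "authenticationInfo": {"principalEmail": "app-sa@my-project.iam.gserviceaccount.com"},
        "requestMetadata": {"callerIp": "203.0.113.5"},
    },
}

def summarize(entry):
    """Pull the who/what/where/when out of one audit log entry."""
    payload = entry["protoPayload"]
    return {
        "who": payload["authenticationInfo"]["principalEmail"],
        "what": payload["methodName"],
        "where": entry["resource"]["type"],
        "when": entry["timestamp"],
        "log_type": entry["logName"].rsplit("%2F", 1)[-1],
    }

print(json.dumps(summarize(entry), indent=2))
```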

Log name

Cloud Audit Logs log names include resource identifiers indicating the Google Cloud project or other Google Cloud entity that owns the audit logs, and whether the log contains Admin Activity, Data Access, Policy Denied, or System Event audit logging data.
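
The log ID portion of an audit log name (for example, cloudaudit.googleapis.com/activity) contains a slash, which must be URL-encoded as %2F when it appears inside a full log name. A minimal Python sketch of that construction, with a hypothetical project ID:

```python
from urllib.parse import quote

def audit_log_name(resource, log_type):
    """Build a fully qualified audit log name. The '/' inside the log ID
    must be URL-encoded as %2F when embedded in a log name."""
    log_id = quote("cloudaudit.googleapis.com/" + log_type, safe="")
    return resource + "/logs/" + log_id

print(audit_log_name("projects/my-project", "data_access"))
```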

The following are the audit log names, including variables for the resource identifiers:

   projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity
   projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Fdata_access
   projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Fsystem_event
   projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Fpolicy

The same log IDs are used for audit logs owned by other resources: replace projects/PROJECT_ID with folders/FOLDER_ID, organizations/ORGANIZATION_ID, or billingAccounts/BILLING_ACCOUNT_ID.

Service name

Bigtable audit logs use the service name bigtableadmin.googleapis.com for admin operations and the service name bigtable.googleapis.com for data operations.

For a list of all the Cloud Logging API service names and their corresponding monitored resource type, see Map services to resources.

Resource types

Bigtable audit logs use the following resource types:

  • bigtable_instance
  • bigtable_cluster
  • bigtable_table
  • bigtable_backup

In addition, for IAM operations, audit logs use the resource type audited_resource.

For a list of all the Cloud Logging monitored resource types and descriptive information, see Monitored resource types.

Caller identities

The IP address of the caller is held in the RequestMetadata.caller_ip field of the AuditLog object. Logging might redact certain caller identities and IP addresses.

For information about what information is redacted in audit logs, see Caller identities in audit logs.

Enable audit logging

Admin Activity audit logs and System Event audit logs are always enabled; you can't disable them.

Data Access audit logs are disabled by default and aren't written unless explicitly enabled (the exception is Data Access audit logs for BigQuery, which can't be disabled).

For information about enabling some or all of your Data Access audit logs, see Enable Data Access audit logs.

Permissions and roles

IAM permissions and roles determine your ability to access audit logs data in Google Cloud resources.

When deciding which Logging-specific permissions and roles apply to your use case, consider the following:

  • The Logs Viewer role (roles/logging.viewer) gives you read-only access to Admin Activity, Policy Denied, and System Event audit logs. If you have just this role, you cannot view Data Access audit logs that are in the _Default bucket.

  • The Private Logs Viewer role (roles/logging.privateLogViewer) includes the permissions contained in roles/logging.viewer, plus the ability to read Data Access audit logs in the _Default bucket.

    Note that if these private logs are stored in user-defined buckets, then any user who has permissions to read logs in those buckets can read the private logs. For more information about log buckets, see Routing and storage overview.

For more information about the IAM permissions and roles that apply to audit logs data, see Access control with IAM.

View logs

You can query for all audit logs or you can query for logs by their audit log name. The audit log name includes the resource identifier of the Google Cloud project, folder, billing account, or organization for which you want to view audit logging information. Your queries can specify indexed LogEntry fields, and if you use the Log Analytics page, which supports SQL queries, then you can view your query results as a chart.

You can view audit logs in Cloud Logging by using the Google Cloud console, the Google Cloud CLI, or the Logging API.


In the Google Cloud console, you can use the Logs Explorer to retrieve your audit log entries for your Google Cloud project, folder, or organization:

  1. In the navigation panel of the Google Cloud console, select Logging, and then select Logs Explorer:

    Go to Logs Explorer

  2. Select an existing Google Cloud project, folder, or organization.

  3. To display all audit logs, enter either of the following queries into the query-editor field, and then click Run query:

     logName:"cloudaudit.googleapis.com"

     protoPayload."@type"="type.googleapis.com/google.cloud.audit.AuditLog"

  4. To display the audit logs for a specific resource and audit log type, in the Query builder pane, do the following:

    • In Resource type, select the Google Cloud resource whose audit logs you want to see.

    • In Log name, select the audit log type that you want to see:

      • For Admin Activity audit logs, select activity.
      • For Data Access audit logs, select data_access.
      • For System Event audit logs, select system_event.
      • For Policy Denied audit logs, select policy.
    • Click Run query.

    If you don't see these options, then there aren't any audit logs of that type available in the Google Cloud project, folder, or organization.

    If you're experiencing issues when trying to view logs in the Logs Explorer, see the troubleshooting information.

    For more information about querying by using the Logs Explorer, see Build queries in the Logs Explorer. For information about summarizing log entries in the Logs Explorer by using Duet AI, see Summarize log entries with Duet AI assistance.


The Google Cloud CLI provides a command-line interface to the Logging API. Supply a valid resource identifier in each of the log names. For example, if your query includes a PROJECT_ID, then the project identifier you supply must refer to the currently selected Google Cloud project.

To read your Google Cloud project-level audit log entries, run the following command:

gcloud logging read "logName : projects/PROJECT_ID/logs/cloudaudit.googleapis.com" \
    --project=PROJECT_ID

To read your folder-level audit log entries, run the following command:

gcloud logging read "logName : folders/FOLDER_ID/logs/cloudaudit.googleapis.com" \
    --folder=FOLDER_ID

To read your organization-level audit log entries, run the following command:

gcloud logging read "logName : organizations/ORGANIZATION_ID/logs/cloudaudit.googleapis.com" \
    --organization=ORGANIZATION_ID

To read your Cloud Billing account-level audit log entries, run the following command:

gcloud logging read "logName : billingAccounts/BILLING_ACCOUNT_ID/logs/cloudaudit.googleapis.com" \
    --billing-account=BILLING_ACCOUNT_ID

Add the --freshness flag to your command to read logs that are more than 1 day old.

For more information about using the gcloud CLI, see gcloud logging read.


When building your queries, supply a valid resource identifier in each of the log names. For example, if your query includes a PROJECT_ID, then the project identifier you supply must refer to the currently selected Google Cloud project.

For example, to use the Logging API to view your project-level audit log entries, do the following:

  1. Go to the Try this API section in the documentation for the entries.list method.

  2. Put the following into the Request body part of the Try this API form. Clicking the prepopulated form automatically fills the request body, but you need to supply a valid PROJECT_ID in each of the log names.

      {
        "resourceNames": [
          "projects/PROJECT_ID"
        ],
        "pageSize": 5,
        "filter": "logName : projects/PROJECT_ID/logs/cloudaudit.googleapis.com"
      }
  3. Click Execute.

Route audit logs

You can route audit logs to supported destinations in the same way that you can route other kinds of logs. Here are some reasons you might want to route your audit logs:

  • To keep audit logs for a longer period of time or to use more powerful search capabilities, you can route copies of your audit logs to Cloud Storage, BigQuery, or Pub/Sub. Using Pub/Sub, you can route to other applications, other repositories, and to third parties.

  • To manage your audit logs across an entire organization, you can create aggregated sinks that can route logs from any or all Google Cloud projects in the organization.

  • If your enabled Data Access audit logs are pushing your Google Cloud projects over your log allotments, you can create sinks that exclude the Data Access audit logs from Logging.

For instructions about routing logs, see Route logs to supported destinations.


For more information about pricing, see Cloud Logging pricing summary.

Split audit log entries

When a log entry exceeds the size limit, Cloud Logging splits that entry and distributes the data across several entries. To learn how to identify and reassemble split audit logs, see Split audit log entries.
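
Conceptually, reassembly means grouping entries that share the same split.uid and ordering them by split.index. The following Python sketch illustrates the idea with invented entries that carry a textPayload for simplicity; split audit log entries actually divide the protoPayload:

```python
from collections import defaultdict

# Illustrative entries; real split entries carry a LogEntry.split object
# with uid, index, and totalSplits fields.
entries = [
    {"split": {"uid": "abc", "index": 1, "totalSplits": 2}, "textPayload": "-part2"},
    {"split": {"uid": "abc", "index": 0, "totalSplits": 2}, "textPayload": "part1"},
    {"textPayload": "not split"},
]

def reassemble(entries):
    """Group split log entries by split.uid and join payloads in index order."""
    groups = defaultdict(list)
    whole = []
    for e in entries:
        if "split" in e:
            groups[e["split"]["uid"]].append(e)
        else:
            whole.append(e["textPayload"])
    for uid, parts in groups.items():
        parts.sort(key=lambda e: e["split"]["index"])
        whole.append("".join(p["textPayload"] for p in parts))
    return whole

print(reassemble(entries))
```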

Managing costs

Bigtable is typically used for large, high-volume workloads. As a result, if you don't manage the log volume, Bigtable can generate an extremely high number of DATA_READ and DATA_WRITE logs, leading to unexpectedly high log storage costs. If you use Data Access audit logging, you should take steps to manage the log volume.

When you follow the best practices for Bigtable authentication, most Data Access audit log activity is generated by service accounts. A service account is an account that an application uses to authenticate and make API calls to Google Cloud services such as Bigtable. Managing service account logs is the most important step in reducing log volume. You might also want to limit logs by using other criteria.

For the ways to enable Data Access audit logging for Bigtable, see Enable Data Access audit logs.

After you enable audit logging, take the following steps to restrict the volume of logs.

Identify service accounts

First, identify the service accounts that you don't need logs for. Which service accounts are not useful to log depends on your application and business needs. To get a list of service accounts that have Cloud Bigtable API (Data API) permissions, you can search the IAM policies for your organization. You can also view them on the IAM Permissions page of the Google Cloud console, on the Principals tab.

Set up log restrictions

Next, set up your log restrictions. There are two ways to manage your Bigtable log volume by limiting service account logs. You can either exempt service accounts using audit configuration, or you can exclude service account logs using logs exclusion filters. For each method, you can either use the Cloud Logging API or the Google Cloud console.

Exempt service accounts using audit configuration

Exempting service accounts using audit configuration is the recommended approach because it lets you prevent certain logs from being generated in the first place. For detailed instructions, see Enable Data Access audit logs.
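
As an illustration, an IAM policy auditConfigs stanza that enables Data Access logging for the Bigtable data service while exempting one service account might look like the JSON below. It's assembled in Python here only so the structure is easy to inspect, and the service account name is hypothetical:

```python
import json

# Hypothetical service account to exempt from Data Access audit logs.
exempt_sa = "serviceAccount:app-sa@my-project.iam.gserviceaccount.com"

audit_config = {
    "auditConfigs": [
        {
            "service": "bigtable.googleapis.com",
            "auditLogConfigs": [
                {"logType": "DATA_READ", "exemptedMembers": [exempt_sa]},
                {"logType": "DATA_WRITE", "exemptedMembers": [exempt_sa]},
            ],
        }
    ]
}

print(json.dumps(audit_config, indent=2))
```

With a configuration like this applied to the project's IAM policy, Data Access entries for the exempted account are never generated, rather than being discarded after the fact.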

Exclude service accounts using exclusion filters

Exclusion filters let you specify logs to be excluded from ingestion into your logs buckets. In this approach, logs are discarded after they have been created, so they still impose a processing load on the Bigtable service components that serve your data. Because of this load, we recommend that you use audit configuration instead. For more information on setting up filters using the Google Cloud console and the API, see Create a sink.
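
For illustration, a filter of the kind you might attach to an exclusion could be assembled like this. The service account names are hypothetical, and the filter matches Bigtable Data Access audit entries generated by those accounts:

```python
def exclusion_filter(service_accounts):
    """Build a Logging filter matching Bigtable Data Access audit entries
    written for the given service accounts."""
    emails = " OR ".join('"{}"'.format(sa) for sa in service_accounts)
    return (
        'logName:"cloudaudit.googleapis.com%2Fdata_access"'
        ' AND protoPayload.serviceName="bigtable.googleapis.com"'
        " AND protoPayload.authenticationInfo.principalEmail=({})".format(emails)
    )

print(exclusion_filter([
    "app-sa@my-project.iam.gserviceaccount.com",
    "batch-sa@my-project.iam.gserviceaccount.com",
]))
```

You could then use the resulting string as the exclusion filter when creating or editing a sink.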