Bigtable audit logging information

This page describes the audit logs created by Cloud Bigtable as part of Cloud Audit Logs.

Overview

Google Cloud services write audit logs to help you answer the questions, "Who did what, where, and when?" within your Google Cloud resources.

Your Google Cloud project contains only the audit logs for resources that are directly within the project. Other Google Cloud entities, such as folders, organizations, and billing accounts, contain the audit logs for the entity itself.

For a general overview of Cloud Audit Logs, see Cloud Audit Logs overview. For a deeper understanding of the audit log format, see Understand audit logs.

Available audit logs

The following types of audit logs are available for Bigtable:

  • Admin Activity audit logs

    Includes "admin write" operations that write metadata or configuration information.

    You can't disable Admin Activity audit logs.

  • Data Access audit logs

    Includes "admin read" operations that read metadata or configuration information. Also includes "data read" and "data write" operations that read or write user-provided data.

    To receive Data Access audit logs, you must explicitly enable them.

For fuller descriptions of the audit log types, see Types of audit logs.

Audited operations

The following tables summarize which API operations correspond to each audit log type in Bigtable:

Admin Activity audit logging

| Resource type | Bigtable operation       | Log type    |
| ------------- | ------------------------ | ----------- |
| AppProfile    | CreateAppProfile         | ADMIN_WRITE |
|               | GetAppProfile            | ADMIN_READ  |
|               | ListAppProfiles          | ADMIN_READ  |
|               | UpdateAppProfile         | ADMIN_WRITE |
|               | EnableAppProfile         | ADMIN_WRITE |
|               | DisableAppProfile        | ADMIN_WRITE |
|               | DeleteAppProfile         | ADMIN_WRITE |
| Backup        | CreateBackup             | ADMIN_WRITE |
|               | UpdateBackup             | ADMIN_WRITE |
|               | DeleteBackup             | ADMIN_WRITE |
|               | RestoreTable             | ADMIN_WRITE |
|               | GetBackup                | ADMIN_READ  |
|               | ListBackups              | ADMIN_READ  |
|               | GetIamPolicy             | ADMIN_READ  |
|               | SetIamPolicy             | ADMIN_WRITE |
| Cluster       | CreateCluster            | ADMIN_WRITE |
|               | UpdateCluster            | ADMIN_WRITE |
|               | PartialUpdateCluster     | ADMIN_WRITE |
|               | DeleteCluster            | ADMIN_WRITE |
|               | GetCluster               | ADMIN_READ  |
|               | ListClusters             | ADMIN_READ  |
| Instance      | CreateInstance           | ADMIN_WRITE |
|               | UpdateInstance           | ADMIN_WRITE |
|               | PartialUpdateInstance    | ADMIN_WRITE |
|               | DeleteInstance           | ADMIN_WRITE |
|               | GetInstance              | ADMIN_READ  |
|               | ListInstances            | ADMIN_READ  |
|               | SetIamPolicy             | ADMIN_WRITE |
|               | GetIamPolicy             | ADMIN_READ  |
| Table         | CreateTable              | ADMIN_WRITE |
|               | DeleteTable              | ADMIN_WRITE |
|               | ModifyColumnFamilies     | ADMIN_WRITE |
|               | ListTables               | ADMIN_READ  |
|               | GetTable                 | ADMIN_READ  |
|               | GenerateConsistencyToken | ADMIN_READ  |
|               | CheckConsistency         | ADMIN_READ  |
|               | GetIamPolicy             | ADMIN_READ  |
|               | SetIamPolicy             | ADMIN_WRITE |
|               | CheckAndMutateRow        | ADMIN_WRITE |

Data Access audit logging

| Resource type | Bigtable operation  | Log type   |
| ------------- | ------------------- | ---------- |
| Backup        | CreateBackup        | DATA_READ  |
| Table         | ReadRows            | DATA_READ  |
|               | SampleRowKeys       | DATA_READ  |
|               | MutateRow           | DATA_WRITE |
|               | MutateRows          | DATA_WRITE |
|               | CheckAndMutateRow   | DATA_READ  |
|               | ReadModifyWriteRow  | DATA_WRITE |
|               | DropRowRange ¹      | DATA_WRITE |

¹ Admin API operation that is audited as part of Data Access audit logging.

If you previously enabled Data Access audit logs for all Google Cloud services in the Cloud Audit Logs default configuration, you might need to take additional steps to enable Data Access audit logging for Bigtable. Affected customers see a notification at the top of the Cloud Console Bigtable page.

For details about which fields are logged for Data Access audit logging, see Audit log fields.

Audit log format

Audit log entries include the following objects:

  • The log entry itself, which is an object of type LogEntry. Useful fields include the following:

    • The logName contains the resource ID and audit log type.
    • The resource contains the target of the audited operation.
    • The timestamp contains the time of the audited operation.
    • The protoPayload contains the audited information.
  • The audit logging data, which is an AuditLog object held in the protoPayload field of the log entry.

  • Optional service-specific audit information, which is a service-specific object. For older integrations, this object is held in the serviceData field of the AuditLog object; newer integrations use the metadata field.

For other fields in these objects, and how to interpret them, review Understand audit logs.
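To make the field layout above concrete, the following sketch parses a minimal, hypothetical Admin Activity audit log entry (the project, table, and principal values are illustrative; the field names follow the LogEntry and AuditLog structures described above):

```python
import json

# A minimal, hypothetical Admin Activity audit log entry for illustration.
entry_json = """
{
  "logName": "projects/my-project/logs/cloudaudit.googleapis.com%2Factivity",
  "resource": {"type": "bigtable_table", "labels": {"project_id": "my-project"}},
  "timestamp": "2024-01-15T10:00:00Z",
  "protoPayload": {
    "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
    "serviceName": "bigtableadmin.googleapis.com",
    "methodName": "google.bigtable.admin.v2.BigtableTableAdmin.CreateTable",
    "authenticationInfo": {"principalEmail": "alice@example.com"}
  }
}
"""

entry = json.loads(entry_json)
payload = entry["protoPayload"]

# Pull out the fields that answer "who did what, where, and when?"
who = payload["authenticationInfo"]["principalEmail"]
what = payload["methodName"]
where = entry["resource"]["type"]
when = entry["timestamp"]

print(who, what, where, when)
```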

Log name

Cloud Audit Logs resource names indicate the Cloud project or other Google Cloud entity that owns the audit logs, and whether the log contains Admin Activity, Data Access, Policy Denied, or System Event audit logging data. For example, the following shows log names for project-level Admin Activity audit logs and an organization's Data Access audit logs. The variables denote Cloud project and organization identifiers.

projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity
organizations/ORGANIZATION_ID/logs/cloudaudit.googleapis.com%2Fdata_access

Service name

Bigtable audit logs use the service name bigtableadmin.googleapis.com for admin operations and bigtable.googleapis.com for data operations.

For a full list of all the Cloud Logging API service names and their corresponding monitored resource type, see Map services to resources.

Resource types

Bigtable audit logs use the resource types bigtable_instance, bigtable_cluster, bigtable_table, and bigtable_backup. In addition, for IAM operations, audit logs use the resource type audited_resource.

For a list of all the Cloud Logging monitored resource types and descriptive information, see Monitored resource types.
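As a sketch of how the service names and resource types above fit together, the following builds a Logging filter string that narrows entries to Bigtable data operations on tables (the specific resource type and service name chosen here are examples; swap in bigtableadmin.googleapis.com and another resource type for admin operations):

```python
# Assumed example values: data-plane service name and the table resource type.
service = "bigtable.googleapis.com"
resource_type = "bigtable_table"

# Compose a filter usable in the Logs Explorer or with `gcloud logging read`.
log_filter = (
    f'resource.type="{resource_type}" '
    f'AND protoPayload.serviceName="{service}"'
)
print(log_filter)
```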

Enable audit logging

Admin Activity audit logs are always enabled; you can't disable them.

Data Access audit logs are disabled by default and aren't written unless explicitly enabled (the exception is Data Access audit logs for BigQuery, which can't be disabled).

For instructions on enabling some or all of your Data Access audit logs, see Configure Data Access logs.
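As a rough sketch, enabling Data Access audit logs for Bigtable data operations amounts to adding an auditConfigs entry to your IAM policy (the exact workflow, including retrieving and re-setting the policy, is described in Configure Data Access logs). The fragment below uses the service and log-type names from the tables above:

```python
import json

# Sketch of the IAM policy auditConfigs fragment that turns on Data Access
# audit logs for Bigtable data operations. You would merge this into the
# policy returned by `gcloud projects get-iam-policy` and write it back.
audit_config = {
    "auditConfigs": [
        {
            "service": "bigtable.googleapis.com",
            "auditLogConfigs": [
                {"logType": "DATA_READ"},
                {"logType": "DATA_WRITE"},
            ],
        }
    ]
}
print(json.dumps(audit_config, indent=2))
```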

Permissions and roles

IAM permissions and roles determine your ability to access audit logs data in Google Cloud resources.

When deciding which Logging-specific permissions and roles apply to your use case, consider the following:

  • The Logs Viewer role (roles/logging.viewer) gives you read-only access to Admin Activity, Policy Denied, and System Event audit logs. If you have just this role, you cannot view Data Access audit logs that are in the _Required and _Default buckets.

  • The Private Logs Viewer role (roles/logging.privateLogViewer) includes the permissions contained in roles/logging.viewer, plus the ability to read Data Access audit logs in the _Required and _Default buckets.

    Note that if these private logs are stored in user-defined buckets, then any user who has permissions to read logs in those buckets can read the private logs. For more information on log buckets, see Routing and storage overview.

For more information on the IAM permissions and roles that apply to audit logs data, see Access control.

View logs

To find and view audit logs, you need to know the identifier of the Cloud project, folder, or organization for which you want to view audit logging information. You can further specify other indexed LogEntry fields, like resource.type; for details, review Find log entries quickly.

The following are the audit log names; they include variables for the identifiers of the Cloud project, folder, or organization:

   projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity
   projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Fdata_access
   projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Fsystem_event
   projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Fpolicy

   folders/FOLDER_ID/logs/cloudaudit.googleapis.com%2Factivity
   folders/FOLDER_ID/logs/cloudaudit.googleapis.com%2Fdata_access
   folders/FOLDER_ID/logs/cloudaudit.googleapis.com%2Fsystem_event
   folders/FOLDER_ID/logs/cloudaudit.googleapis.com%2Fpolicy

   organizations/ORGANIZATION_ID/logs/cloudaudit.googleapis.com%2Factivity
   organizations/ORGANIZATION_ID/logs/cloudaudit.googleapis.com%2Fdata_access
   organizations/ORGANIZATION_ID/logs/cloudaudit.googleapis.com%2Fsystem_event
   organizations/ORGANIZATION_ID/logs/cloudaudit.googleapis.com%2Fpolicy
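A note on the %2F that appears in these names: the log ID contains a literal slash that must be URL-encoded. The following sketch reproduces the project-level log names listed above (PROJECT_ID is a placeholder):

```python
from urllib.parse import quote

project_id = "PROJECT_ID"  # placeholder
log_types = ["activity", "data_access", "system_event", "policy"]

# Percent-encode the "/" in the log ID, yielding %2F in the log name.
log_names = [
    f"projects/{project_id}/logs/{quote(f'cloudaudit.googleapis.com/{t}', safe='')}"
    for t in log_types
]
for name in log_names:
    print(name)
```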

You can view audit logs in Cloud Logging using the Cloud Console, the gcloud command-line tool, or the Logging API.

Console

You can use the Logs Explorer in the Cloud Console to retrieve your audit log entries for your Cloud project, folder, or organization:

  1. In the Cloud Console, go to the Logging > Logs Explorer page.

    Go to the Logs Explorer page

  2. On the Logs Explorer page, select an existing Cloud project, folder, or organization.

  3. In the Query builder pane, do the following:

    • In Resource type, select the Google Cloud resource whose audit logs you want to see.

    • In Log name, select the audit log type that you want to see:

      • For Admin Activity audit logs, select activity.
      • For Data Access audit logs, select data_access.
      • For System Event audit logs, select system_event.
      • For Policy Denied audit logs, select policy.

    If you don't see these options, then there aren't any audit logs of that type available in the Cloud project, folder, or organization.

    For more details about querying using the Logs Explorer, see Build log queries.

gcloud

The gcloud command-line tool provides a command-line interface to the Cloud Logging API. Supply a valid PROJECT_ID, FOLDER_ID, or ORGANIZATION_ID in each of the log names.

To read your Cloud project-level audit log entries, run the following command:

gcloud logging read "logName : projects/PROJECT_ID/logs/cloudaudit.googleapis.com" --project=PROJECT_ID

To read your folder-level audit log entries, run the following command:

gcloud logging read "logName : folders/FOLDER_ID/logs/cloudaudit.googleapis.com" --folder=FOLDER_ID

To read your organization-level audit log entries, run the following command:

gcloud logging read "logName : organizations/ORGANIZATION_ID/logs/cloudaudit.googleapis.com" --organization=ORGANIZATION_ID

For more information about using the gcloud tool, see Read log entries.

API

When building your queries, replace the variables with valid values, and substitute the appropriate project-level, folder-level, or organization-level audit log name as listed in the audit log names above. For example, if your query includes a PROJECT_ID, then the project identifier you supply must refer to the currently selected Cloud project.

To use the Logging API to look at your audit log entries, do the following:

  1. Go to the Try this API section in the documentation for the entries.list method.

  2. Put the following into the Request body part of the Try this API form. The prepopulated form fills in the request body, but you need to supply a valid PROJECT_ID in each of the log names.

    {
      "resourceNames": [
        "projects/PROJECT_ID"
      ],
      "pageSize": 5,
      "filter": "logName : projects/PROJECT_ID/logs/cloudaudit.googleapis.com"
    }
    
  3. Click Execute.

For more details about querying, see Logging query language.

For an example of an audit log entry and how to find the most important information in it, see Sample audit log entry.

Route audit logs

You can route audit logs to supported destinations in the same way that you can route other kinds of logs. Here are some reasons you might want to route your audit logs:

  • To keep audit logs for a longer period of time or to use more powerful search capabilities, you can route copies of your audit logs to Cloud Storage, BigQuery, or Pub/Sub. Using Pub/Sub, you can route to other applications, other repositories, and to third parties.

  • To manage your audit logs across an entire organization, you can create aggregated sinks that can route logs from any or all Cloud projects in the organization.

  • If your enabled Data Access audit logs are pushing your Cloud projects over your log allotments, you can create sinks that exclude the Data Access audit logs from Logging.

For instructions on routing logs, see Configure sinks.
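For the last use case above, a sink exclusion (or a Logging exclusion filter) needs a filter that matches only the logs you want dropped. The following sketch composes one that targets Bigtable Data Access audit logs (PROJECT_ID is a placeholder):

```python
project_id = "PROJECT_ID"  # placeholder

# Filter matching only Bigtable Data Access audit log entries, suitable for
# use as an exclusion filter on a sink.
exclusion_filter = (
    f'logName="projects/{project_id}/logs/cloudaudit.googleapis.com%2Fdata_access" '
    f'AND protoPayload.serviceName="bigtable.googleapis.com"'
)
print(exclusion_filter)
```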

Pricing

Admin Activity audit logs and System Event audit logs are free.

Data Access audit logs and Policy Denied audit logs are chargeable.

For more information about Cloud Logging pricing, see Google Cloud's operations suite pricing: Cloud Logging.

Managing costs

Bigtable is typically used for large, high-volume workloads. As a result, if you don't manage the log volume, Bigtable can generate an extremely high number of DATA_READ and DATA_WRITE logs, leading to unexpectedly high log storage costs. If you use Data Access audit logging, you should take steps to manage the log volume.

When you follow the best practices for Bigtable authentication, most Data Access audit log activity is generated by service accounts. A service account is an account that an application uses to authenticate and make API calls to Google Cloud services such as Bigtable. Managing service account logs is the most important step to reduce log volume. You might want to also limit logs using other criteria.

To enable Data Access audit logging for Bigtable, follow the instructions in Configure Data Access logs. After you enable logging, take the following steps to restrict the volume of logs.

Identify service accounts

First, identify the service accounts that you don't need logs for. Which service accounts are not useful to log depends on your application and business needs. To get a list of service accounts that have Bigtable Data API permissions, you can search IAM policies for your organization. You can also view them on the IAM Permissions Cloud Console page on the Members tab.

Set up log restrictions

Next, set up your log restrictions. There are two ways to manage your Bigtable log volume by limiting service account logs. You can either exempt service accounts using audit configuration, or you can exclude service account logs using logs exclusion filters. For each method, you can either use the Cloud Logging API or the Google Cloud Console.

Exempting service accounts using audit configuration

Exempting service accounts using audit configuration is the recommended approach because it lets you prevent certain logs from being generated in the first place. For detailed instructions, see the following:
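As a sketch, an exemption is expressed in the same auditConfigs structure used to enable logging, with the service account listed under exemptedMembers so its calls generate no Data Access audit logs at all (the service account address below is a hypothetical example):

```python
import json

sa = "serviceAccount:my-app@my-project.iam.gserviceaccount.com"  # hypothetical

# Audit configuration entry exempting one service account from Bigtable
# DATA_READ and DATA_WRITE audit logging.
audit_log_config = {
    "service": "bigtable.googleapis.com",
    "auditLogConfigs": [
        {"logType": "DATA_READ", "exemptedMembers": [sa]},
        {"logType": "DATA_WRITE", "exemptedMembers": [sa]},
    ],
}
print(json.dumps(audit_log_config, indent=2))
```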

Excluding service accounts using logs exclusion filters

Logs exclusions let you specify logs to be excluded from ingestion into your logs buckets. In this approach, logs are discarded after they have been created, so they still impose a processing load on the Bigtable service components that serve your data. Because of this load, we recommend that you use audit configuration instead. For more information on setting up filters, see the following:
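As a sketch, an exclusion filter that targets one service account matches on the caller identity field in the AuditLog payload (the service account address below is a hypothetical example):

```python
sa = "my-app@my-project.iam.gserviceaccount.com"  # hypothetical

# Exclusion filter that discards Bigtable audit log entries generated by
# this service account; principalEmail is the caller identity in the
# AuditLog payload.
exclusion_filter = (
    'protoPayload.serviceName="bigtable.googleapis.com" '
    f'AND protoPayload.authenticationInfo.principalEmail="{sa}"'
)
print(exclusion_filter)
```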