首先,找出不需要記錄檔的服務帳戶。不實用且不應記錄的服務帳戶清單,取決於您的應用程式和業務需求。如要取得具有 Cloud Bigtable API (資料 API) 權限的服務帳戶清單,您可以搜尋機構的 IAM 政策。您也可以在「主體」分頁的 IAM 權限 Google Cloud 控制台頁面中查看這些角色。
設定記錄限制
接著,請設定記錄限制。您可以透過兩種方式,限制服務帳戶記錄,藉此管理 Bigtable 記錄量。您可以透過稽核設定豁免服務帳戶,也可以使用記錄排除篩選器排除服務帳戶記錄。您可以使用 Cloud Logging API 或 Google Cloud 控制台,透過上述任一方法匯出記錄。
[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["難以理解","hardToUnderstand","thumb-down"],["資訊或程式碼範例有誤","incorrectInformationOrSampleCode","thumb-down"],["缺少我需要的資訊/範例","missingTheInformationSamplesINeed","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["上次更新時間:2025-09-04 (世界標準時間)。"],[[["\u003cp\u003eBigtable's high-volume workloads can generate a large number of DATA_READ and DATA_WRITE logs, leading to unexpectedly high log storage costs if not managed properly.\u003c/p\u003e\n"],["\u003cp\u003eManaging service account logs is crucial for reducing log volume, as most Data Access audit log activity is generated by service accounts when following Bigtable authentication best practices.\u003c/p\u003e\n"],["\u003cp\u003eThe recommended approach to control log volume is to exempt service accounts from logging using audit configuration, as it prevents logs from being generated in the first place, unlike exclusion filters that discard logs after they're created.\u003c/p\u003e\n"],["\u003cp\u003eBefore enabling Data Access audit logging for Bigtable, it is important to estimate the potential Cloud Audit Logs ingestion and storage costs, as these costs are directly tied to the volume of Bigtable requests logged each month.\u003c/p\u003e\n"],["\u003cp\u003eYou can estimate your potential monthly log volume and costs by calculating the average requests per second multiplied by the seconds in a month, while taking into consideration free data and storage.\u003c/p\u003e\n"]]],[],null,["# Manage Data Access audit log costs\n==================================\n\n| **Key Point:** To prevent unexpected high costs, restrict service account logging.\n\nBigtable is typically used for large, high-volume workloads. As a\nresult, if you don't manage the log volume, Bigtable can generate\nan **extremely high number** of DATA_READ and DATA_WRITE logs, leading to\nunexpectedly high [log storage costs](/stackdriver/pricing#logging-costs). If you use Data Access\naudit logging, you should take steps to manage the log volume.\n\nWhen you follow the best practices for [Bigtable authentication](/bigtable/docs/authentication), most Data Access audit log activity is generated by *service\naccounts* . A [service account](/iam/docs/service-accounts) is an account that an\napplication uses to authenticate and make API calls to Google Cloud\nservices such as Bigtable. Managing service account logs is the\nmost important step to reduce log volume. You might want to also limit logs\nusing other criteria.\n\nYou can enable Data Access audit logging for Bigtable in the\nfollowing ways:\n\n- Using the Google Cloud console\n - [Individual services](/logging/docs/audit/configure-data-access#config-console) (for example, only Bigtable)\n - [Default config](/logging/docs/audit/configure-data-access#config-console-default) all services in a Google Cloud project (not just Bigtable)\n- [Using the Cloud Logging API](/logging/docs/audit/configure-data-access#config-api)\n\nAfter you enable audit logging, take the following steps to restrict the volume\nof logs.\n\nIdentify service accounts\n-------------------------\n\nFirst, identify the service accounts that you don't need logs for. The list of\nservice accounts that are not useful and should not be logged depends on your\napplication and business needs.To get a list of service accounts that have\nCloud Bigtable API (Data API) permissions, you can [search IAM\npolicies](/asset-inventory/docs/searching-iam-policies) for your organization. You can also view them on\nthe [IAM Permissions](https://console.cloud.google.com/iam-admin/iam) Google Cloud console\npage on the **Principals** tab.\n\nSet up log restrictions\n-----------------------\n\nNext, set up your log restrictions. There are two ways to manage your\nBigtable log volume by limiting service account logs. You can\neither *exempt* service accounts using audit configuration, or you can *exclude*\nservice account logs using logs exclusion filters. For each method, you can\neither use the [Cloud Logging API](/logging/docs/reference/api-overview)\nor the Google Cloud console.\n\n### Exempt service accounts using audit configuration\n\nExempting service accounts using [audit configuration](/logging/docs/audit/configure-data-access) is the\n**recommended approach** because it lets you prevent certain logs from being\ngenerated in the first place. For detailed instructions, see the following:\n\n- [Configuring Data Access audit logs with the API](/logging/docs/audit/configure-data-access#config-api)\n- [Configuring Data Access audit logs with the Google Cloud console](/logging/docs/audit/configure-data-access#config-console-exempt)\n\n### Exclude service accounts using exclusion filters\n\n[Exclusion filters](/logging/docs/routing/overview#exclusions) let you specify\nlogs to be excluded from ingestion into your logs buckets. In this approach,\nlogs are discarded *after they have been created* , so they still impose a\nprocessing load on the Bigtable service components that serve\nyour data. Because of this load, we recommend that you use audit configuration\ninstead. For more information on setting up filters using the\nGoogle Cloud console and the API, see [Create a sink](/logging/docs/export/configure_export_v2#creating_sink).\n\nEstimate Data Access audit log costs\n------------------------------------\n\nBecause Bigtable is typically used for large, high-volume workloads,\nit has the potential to generate an extremely high number of logs. Before you\nenable Data Access audit logging for Bigtable, you should estimate and\nunderstand the [Cloud Audit Logs](/logging/docs/audit) ingestion and storage costs\nthat audit logging can incur each month.\n\nYour Data Access audit logging costs are directly related to the number of\nBigtable requests that you choose to log each month. The\nfollowing table shows rough estimates of the [Cloud Audit Logs costs](/stackdriver/pricing) that you can expect based on\nyour average requests per second and the length of time that you store your\nlogs, assuming that you log all data requests. See [Calculating your costs](#calculating-costs) for a detailed explanation of how these estimates are\ncalculated.\n\nCalculate your costs\n--------------------\n\nStart with the following assumptions:\n\n- The number of seconds in an average month is about 2,628,000.\n- The average audit size is around 1 kb.\n- You are not charged for the first 50 GiB of audit logs that are ingested per month, and after you pass that amount, you're charged $0.50/GiB.\n- Storage is free for 30 days; after that you're charged $0.01/GiB for storage.\n\nThe method described on this page provides a gross estimate based on all\ntraffic. In production, you are encouraged to\n[restrict service account logging](#identify-accounts).\n| **Note:** To keep calculations simple, this page estimates log volume in GB, then converts to GiB. Cloud Logging is billed per GiB.\n\n### Calculate your monthly log volume\n\nFirst, estimate the average amount of logs your traffic will generate in an\naverage month.\n\n1. Gather the *average number of requests per second* that your application sends to Bigtable over the course of a month.\n - If you use client-side metrics, you can use them to determine your average queries per second (QPS) for the last month.\n - If you prefer to use your instance's system insights page in the Google Cloud console, use it to determine the average values for read requests and write requests per second over the last month, then add those two values together.\n2. Multiply the requests per second by 2,628,000 to get the *average\n requests per month*.\n3. Divide that number by 10e6, or 1,000,000. The result is the *estimated\n monthly log volume in GB* that you might generate each month.\n4. Multiply the monthly log volume in GB by .93 to get the approximate *monthly log volume in GiB*.\n\n### Calculate your ingestion costs\n\n1. Subtract 50 GiB from the monthly log volume in GiB that you calculated. There is no charge for the first 50 GiB.\n2. Multiply the remainder by $0.50 to arrive at your estimated monthly ingestion costs.\n\n### Calculate your storage costs\n\n1. If you plan to let your logs expire after 30 days, your cost for storage is $0.00.\n2. If you store your logs for longer than 30 days, your storage costs can be estimated by multiplying the monthly log volume by $0.01. These costs start to incur after the first month.\n\nDetailed example\n----------------\n\n### 5,000 request per second, logs retained for 90 days\n\nIn this example, suppose that your average number of requests per second is\n5,000 and you plan to keep your logs for 90 days. Using the steps on this page,\nyou calculate the following estimates:\n\n- Multiply 5,000 by 2,628,000 to arrive at 13,140,000,000 requests per month.\n- Divide 13,140,000,000 by 10e6 to arrive at roughly 13,140 GB of monthly log volume.\n- Convert that number to GiB by multiplying it by .93 to arrive at 12,220.\n- Subtract 50 GiB from your monthly log volume to get 12,170 GiB.\n- Multiply by $0.50 to get $6,085 in ingestion costs.\n- For the first month that your logs exist, the storage cost is $0.\n- The second month, the log storage cost is 12,170 multiplied by $0.01, or about $122.\n- Every month after the second month, the monthly storage cost is double that, or $244.\n- After the second month, your estimated Data Access audit logging costs would be around $6,329 per month.\n\nPresented in equation form, this looks like (((((5,000 rps \\* 2,628,000\nsec)/1,000,000) \\* .93) - 50 GiB) \\* $0.50) + $122 = $6,207.\n\nIn this example, your monthly Data Access logging costs are around **$6,329 per\nmonth**.\n\nWhat's next\n-----------\n\n- [See what a Data Access audit log entry looks like.](/bigtable/docs/audit-log-example)"]]