Introduction to audit logs in BigQuery
======================================

*Last updated 2025-09-04 (UTC).*

Logs are text records that are generated in response to particular events or actions. For instance, BigQuery creates log entries for actions such as creating or deleting a table, purchasing slots, or running a load job.

Google Cloud also writes logs, including audit logs that provide insight into operational concerns related to your use of Google Cloud services.
For more information about how Google Cloud handles logging, see the [Cloud Logging](/logging/docs) documentation and [Cloud Audit Logs overview](/logging/docs/audit).

Audit logs versus `INFORMATION_SCHEMA` views
--------------------------------------------

Your Google Cloud projects contain audit logs only for the resources that are directly within the Google Cloud project. Other Google Cloud resources, such as folders, organizations, and billing accounts, contain their own audit logs.

Audit logs help you answer the question "Who did what, where, and when?" within your Google Cloud resources. Audit logs are the definitive source of information about system activity by users and access patterns, and they should be your primary source for audit or security questions.

[`INFORMATION_SCHEMA`](/bigquery/docs/information-schema-intro) views in BigQuery are another source of insights that you can use along with metrics and logs. These views contain metadata about jobs, datasets, tables, and other BigQuery entities. For example, you can get real-time metadata about which BigQuery jobs ran during a specified time. Then, you can group or filter the results by project, user, tables referenced, and other dimensions.

`INFORMATION_SCHEMA` views provide the information you need to perform a more detailed analysis of your BigQuery workloads, such as the following:

- What is the average slot utilization for all queries over the past seven days for a given project?
- What streaming errors occurred in the past 30 minutes, grouped by error code?

BigQuery audit logs contain log entries for API calls, but they don't describe the impact of the API calls. A subset of API calls creates jobs (such as query and load) whose information is captured by `INFORMATION_SCHEMA` views.
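As a sketch of the kind of per-job analysis these views enable, the following Python computes average slot utilization from rows shaped like `INFORMATION_SCHEMA.JOBS` output. `total_slot_ms`, `start_time`, and `end_time` are real JOBS columns; the in-memory sample rows and job IDs here are hypothetical stand-ins for what a SQL query against the view would return.

```python
from datetime import datetime, timedelta

# Hypothetical rows shaped like a subset of INFORMATION_SCHEMA.JOBS output.
# In BigQuery you would SELECT these columns with SQL; values are made up.
jobs = [
    {"job_id": "job_1", "total_slot_ms": 120_000,
     "start_time": datetime(2025, 9, 1, 10, 0, 0),
     "end_time":   datetime(2025, 9, 1, 10, 0, 30)},
    {"job_id": "job_2", "total_slot_ms": 45_000,
     "start_time": datetime(2025, 9, 1, 11, 0, 0),
     "end_time":   datetime(2025, 9, 1, 11, 0, 15)},
]

def average_slots(job):
    """Average slots used over the job's runtime: total_slot_ms / elapsed ms."""
    elapsed_ms = (job["end_time"] - job["start_time"]) / timedelta(milliseconds=1)
    return job["total_slot_ms"] / elapsed_ms

for job in jobs:
    # job_1 ran 30 s using 120,000 slot-ms -> 4.0 slots on average.
    print(job["job_id"], round(average_slots(job), 1))
```

None of the per-slot figures above are recoverable from audit logs alone, which is the distinction this section draws.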
For example, you can find information about the time and slots that are utilized by a specific query in `INFORMATION_SCHEMA` views but not in the audit logs.

To get insights into the performance of your BigQuery workloads in particular, see [jobs metadata](/bigquery/docs/information-schema-jobs), [streaming metadata](/bigquery/docs/information-schema-streaming), and [reservations metadata](/bigquery/docs/information-schema-reservations).

For more information about the types of audit logs that Google Cloud services write, see [Types of audit logs](/logging/docs/audit#types).

Audit log format
----------------

Google Cloud services write audit logs in a structured JSON format. The base data type for Google Cloud log entries is the [`LogEntry`](/logging/docs/reference/v2/rest/v2/LogEntry) structure. This structure contains the name of the log, the resource that generated the log entry, the timestamp (UTC), and other basic information.

Logs include details of the logged event in a subfield that's called the *payload field*. For audit logs, the payload field is named `protoPayload`. This field's type (`protoPayload.@type`) is set to `type.googleapis.com/google.cloud.audit.AuditLog`, which indicates that the field uses the [`AuditLog`](/logging/docs/reference/audit/auditlog/rest/Shared.Types/AuditLog) log structure.

For operations on datasets, tables, and jobs, BigQuery writes audit logs in two different formats, although both formats share the `AuditLog` base type.

The older format includes the following fields and values:

- The value for the `resource.type` field is `bigquery_resource`.
- BigQuery writes the details about an operation in the `protoPayload.serviceData` field.
The value of this field uses the [`AuditData`](/bigquery/docs/reference/auditlogs/rest/Shared.Types/AuditData) log structure.

The newer format includes the following fields and values:

- The value for the `resource.type` field is either `bigquery_project` or `bigquery_dataset`. The `bigquery_project` resource has log entries about jobs, while the `bigquery_dataset` resource has log entries about storage.
- BigQuery writes the details about an operation in the `protoPayload.metadata` field. The value of this field uses the [`BigQueryAuditMetadata`](/bigquery/docs/reference/auditlogs/rest/Shared.Types/BigQueryAuditMetadata) structure.

We recommend consuming logs in the newer format. For more information, see the [Audit logs migration guide](/bigquery/docs/reference/auditlogs/migration).

The following is an abbreviated example of a log entry that shows a failed operation:

    {
      "protoPayload": {
        "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
        "status": {
          "code": 5,
          "message": "Not found: Dataset myproject:mydataset was not found in location US"
        },
        "authenticationInfo": { ... },
        "requestMetadata": { ... },
        "serviceName": "bigquery.googleapis.com",
        "methodName": "google.cloud.bigquery.v2.JobService.InsertJob",
        "metadata": {}
      },
      "resource": {
        "type": "bigquery_project",
        "labels": { ... }
      },
      "severity": "ERROR",
      "logName": "projects/myproject/logs/cloudaudit.googleapis.com%2Fdata_access",
      ...
    }

For operations on BigQuery reservations, the `protoPayload` field uses the `AuditLog` structure, and the `protoPayload.request` and `protoPayload.response` fields contain more information. You can find the field definitions in the [BigQuery Reservation API](/bigquery/docs/reference/reservations/rpc).
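A log consumer can use the field differences described above to route entries between the two formats. The following Python sketch classifies an entry like the abbreviated example shown earlier; the JSON here is that example with the elided fields omitted so it parses, and the presence of `serviceData` versus `metadata` in `protoPayload` is the distinguishing signal.

```python
import json

# The abbreviated audit log entry from the example above, with elided
# ("...") fields omitted so that the document parses as valid JSON.
entry = json.loads("""
{
  "protoPayload": {
    "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
    "status": {
      "code": 5,
      "message": "Not found: Dataset myproject:mydataset was not found in location US"
    },
    "serviceName": "bigquery.googleapis.com",
    "methodName": "google.cloud.bigquery.v2.JobService.InsertJob",
    "metadata": {}
  },
  "resource": {"type": "bigquery_project", "labels": {}},
  "severity": "ERROR",
  "logName": "projects/myproject/logs/cloudaudit.googleapis.com%2Fdata_access"
}
""")

def log_format(entry):
    """Classify a BigQuery audit log entry as old- or new-format.

    Old format: details in protoPayload.serviceData (resource.type
    bigquery_resource). New format: details in protoPayload.metadata
    (resource.type bigquery_project or bigquery_dataset).
    """
    payload = entry["protoPayload"]
    if "serviceData" in payload:
        return "old"
    if "metadata" in payload:
        return "new"
    return "unknown"

print(log_format(entry))                        # -> new
print(entry["protoPayload"]["status"]["code"])  # 5 = NOT_FOUND in google.rpc.Code
```

Because the entry carries `protoPayload.metadata` and a `bigquery_project` resource type, it is classified as new-format; its `status.code` of 5 corresponds to `NOT_FOUND`, matching the error message.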
For more information, see [Monitoring BigQuery reservations](/bigquery/docs/reservations-monitoring).

For a deeper understanding of the audit log format, see [Understand audit logs](/logging/docs/audit/understanding-audit-logs).

Limitations
-----------

Log messages have a size limit of 100,000 bytes. For more information, see [Truncated log entry](/bigquery/docs/reference/auditlogs#truncated_log_entry).

Visibility and access control
-----------------------------

BigQuery audit logs can include information that users might consider sensitive, such as SQL text, schema definitions, and identifiers for resources such as tables and datasets. For information about managing access to this information, see the Cloud Logging [access control documentation](/logging/docs/access-control).

What's next
-----------

- To learn how to use Cloud Logging to audit activities that are related to policy tags, see [Audit policy tags](/bigquery/docs/auditing-policy-tags).
- To learn how to use BigQuery to analyze logged activity, see [BigQuery audit logs overview](/bigquery/docs/reference/auditlogs).