Last updated (UTC): 2025-07-24.

# Analyze log volume with Log Analytics

| **Preview**
|
| This product or feature is subject to the "Pre-GA Offerings Terms" in the General Service Terms section
| of the [Service Specific Terms](/terms/service-terms#1).
|
| Pre-GA products and features are available "as is" and might have limited support.
|
| For more information, see the
| [launch stage descriptions](/products#product-launch-stages).

This document describes how you can use Log Analytics to estimate the
billable volume of your log entries. You can write queries that report and
aggregate your billable volume by different dimensions, like resource type or
application name, and then chart and view the query results.

How to query for billable volume
--------------------------------

The billable volume of a log entry, which is the size that is reported to
Cloud Billing, is available through the `storage_bytes` field. In your
queries, you can use the `storage_bytes` field in the same way that you use
any schema field whose data type is `INTEGER`. For example, you can include
it in `SELECT` clauses, in `CASE` statements, and in common table
expressions. For more information about querying your logs, see the
following documents:

- [Query and view logs in Log Analytics](/logging/docs/analyze/query-and-view)
- [Sample SQL queries](/logging/docs/analyze/examples)

Because Cloud Billing uses the billable volume when determining your costs,
you can write queries that help you understand the sources of your costs.
For example, you can write queries that help you determine
which applications are writing the most log entries. To learn how to relate
billable volume to cost, see the Cloud Logging sections of the
[Google Cloud Observability pricing](https://cloud.google.com/stackdriver/pricing) page.

The billable volume of a log entry isn't the size of the
[`LogEntry`](/logging/docs/reference/v2/rest/v2/LogEntry) object that was
sent to the Cloud Logging API. The billable volume includes bytes that are
required for serialization and metadata.

Before you begin
----------------

This section describes steps that you must complete before you can use
Log Analytics.

### Configure log buckets

Ensure that your log buckets have been upgraded to use Log Analytics:

1. In the Google Cloud console, go to the **Logs Storage** page:

   [Go to **Logs Storage**](https://console.cloud.google.com/logs/storage)

   If you use the search bar to find this page, then select the result whose
   subheading is **Logging**.
2. For each log bucket that has a log view that you want to query, ensure
   that the **Log Analytics available** column displays **Open**.
   If **Upgrade** is shown, then click **Upgrade** and complete the dialog.

### Configure IAM roles and permissions

This section describes the IAM roles or permissions that are required to
use Log Analytics:

- To get the permissions that you need to use Log Analytics and query log
  views, ask your administrator to grant you the following
  IAM roles on your project:

  - To query the `_Required` and `_Default` log buckets:
    [Logs Viewer](/iam/docs/roles-permissions/logging#logging.viewer) (`roles/logging.viewer`)
  - To query all log views in a project:
    [Logs View Accessor](/iam/docs/roles-permissions/logging#logging.viewAccessor) (`roles/logging.viewAccessor`)

  You can restrict a principal to a specific log view either by adding an
  IAM condition to the Logs View Accessor role grant made at
  the project level, or by adding an IAM binding to the policy file
  of the log view. For more information, see
  [Control access to a log view](/logging/docs/logs-views#about-iam-policies).

  These are the same permissions that you need to view log entries on the
  **Logs Explorer** page.
  For information about additional roles that you need to query views on
  user-defined buckets or to query the `_AllLogs` view of the `_Default`
  log bucket, see
  [Cloud Logging roles](/logging/docs/access-control#considerations).
- To get the permissions that you need to query analytics views, ask your
  administrator to grant you the
  [Observability Analytics User](/iam/docs/roles-permissions/observability#observability.analyticsUser) (`roles/observability.analyticsUser`)
  IAM role on your project.

Sample queries
--------------

This section provides example queries that analyze data from a single log
view. If you store data in multiple log views and you want to compute
aggregate values for the data stored in those views, then you need to use
the `UNION` operator.

You can query your log entries by using the **Log Analytics** page or
anywhere you can query BigQuery datasets, which includes the
**BigQuery Studio** and **Looker Studio** pages, and the
[bq command-line tool](/bigquery/docs/reference/bq-cli-reference).

To use the sample queries, do the following:

- **Log Analytics** page: Replace `TABLE_NAME_OF_LOG_VIEW` with the name of
  the log view.
  The format is `project_ID.region.bucket_ID.view_ID`.

  [Go to **Log Analytics**](https://console.cloud.google.com/logs/analytics)
- BigQuery datasets: Replace `TABLE_NAME_OF_LOG_VIEW` with the
  [path to the table](/bigquery/docs/reference/standard-sql/data-definition-language#table_path)
  in the linked dataset.

  [Go to **BigQuery Studio**](https://console.cloud.google.com/bigquery)

### Query for log volume by app

To compute the total bytes per day, per app, for your log entries that were
written against a Google Kubernetes Engine resource and that have a JSON
payload, use the following query:

    SELECT
      TIMESTAMP_TRUNC(timestamp, DAY) AS day,
      JSON_VALUE(labels["k8s-pod/app"]) AS app_id,
      SUM(storage_bytes) AS total_bytes
    FROM
      `TABLE_NAME_OF_LOG_VIEW`
    WHERE
      json_payload IS NOT NULL
      AND resource.type = "k8s_container"
    GROUP BY ALL

To visualize the data, you can create a chart.

| **Note:** The previous query aggregates the data over one-day intervals.
| To create a chart that displays data for multiple days, you might need to
| modify the time-range selector. In the following example, the query
| evaluates data over a time range of `14 days`.

In the following example, the data is displayed as a stacked bar chart.
Each bar on the chart displays the total number of bytes stored, organized
by app. In this example, the `frontend` app is generating the most log data.

### Query for log volume by log name

To list the number of stored bytes and the log name for each log entry that
has a JSON payload and that was written against a
Google Kubernetes Engine resource, use the following query:

    SELECT
      log_id AS log_name,
      storage_bytes
    FROM
      `TABLE_NAME_OF_LOG_VIEW`
    WHERE
      json_payload IS NOT NULL
      AND resource.type = "k8s_container"

The previous query doesn't aggregate the results; instead, there is one row
for each log entry, and that row contains a log name and the number of
stored bytes. If you chart this data, then you can visualize the proportion
of your log data that is written to different logs.

The previous chart shows that most log data is written to the log named
`stdout`.

### Use the bq command-line tool to query for log volume by log name

You can include the `storage_bytes` field in queries that you run through
the **BigQuery Studio** page or by using the
[bq command-line tool](/bigquery/docs/reference/bq-cli-reference).

The following query reports the log name and the number of stored bytes for
each log entry:

    bq query --use_legacy_sql=false 'SELECT log_id AS log_name,
    storage_bytes FROM `TABLE_NAME_OF_LOG_VIEW`'

The result of this query is similar to the following:

    +----------+---------------+
    | log_name | storage_bytes |
    +----------+---------------+
    | stdout   |           716 |
    | stdout   |           699 |
    | stdout   |           917 |
    | stdout   |           704 |

Each row corresponds to one log entry.
The value of the `storage_bytes` column is the billable volume for that
log entry.

Limitations
-----------

The `storage_bytes` field is available only when the following are true:

- The log bucket is upgraded to use Log Analytics.
- Your query is executed on the **Log Analytics** page or anywhere you can
  query BigQuery datasets, which includes the **BigQuery Studio** and
  **Looker Studio** pages, and the
  [bq command-line tool](/bigquery/docs/reference/bq-cli-reference).
- The log entry was written on or after January 1, 2024.
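The techniques shown in this document can be combined. The following query
is only a sketch: the log-view paths (`MY_PROJECT.us-central1.MY_BUCKET.MY_VIEW_A`
and `MY_VIEW_B`) and the severity buckets are illustrative assumptions, not
values from your project. It uses a common table expression and a `UNION ALL`
to merge two log views, then applies a `CASE` statement so that
`storage_bytes` is aggregated per day and per severity class:

```sql
-- Sketch only: replace the placeholder view paths with your own log views.
WITH combined AS (
  SELECT timestamp, severity, storage_bytes
  FROM `MY_PROJECT.us-central1.MY_BUCKET.MY_VIEW_A`
  UNION ALL
  SELECT timestamp, severity, storage_bytes
  FROM `MY_PROJECT.us-central1.MY_BUCKET.MY_VIEW_B`
)
SELECT
  TIMESTAMP_TRUNC(timestamp, DAY) AS day,
  -- Bucket entries by severity to see which class of logs drives cost.
  CASE
    WHEN severity IN ('ERROR', 'CRITICAL', 'ALERT', 'EMERGENCY') THEN 'error'
    WHEN severity = 'WARNING' THEN 'warning'
    ELSE 'info_or_lower'
  END AS severity_class,
  SUM(storage_bytes) AS total_bytes
FROM combined
GROUP BY day, severity_class
ORDER BY day, severity_class
```

As with the earlier samples, you can run this query on the **Log Analytics**
page, in **BigQuery Studio**, or with the bq command-line tool, and chart
the result to compare how much billable volume each severity class
contributes over time.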