Query and analyze logs with Log Analytics

This document describes how to query and analyze the log data stored in log buckets that have been upgraded to use Log Analytics. You can query logs in these buckets by using SQL, which lets you filter and aggregate your logs. To view your query results, you can use the tabular form, or you can visualize the data with charts. These tables and charts can be saved to your custom dashboards.

You can query either a log view on a log bucket or an analytics view. When you query a log view, the schema corresponds to that of the LogEntry data structure. Because the creator of an analytics view determines the schema, one use case for analytics views is to transform log data from the LogEntry format into a format that is more suitable for you.

You can use the Logs Explorer to view log entries stored in log buckets in your project, whether or not the log bucket has been upgraded to use Log Analytics.

Log Analytics doesn't deduplicate log entries, which might affect how you write your queries. There are also some restrictions on using Log Analytics. For more information about these topics, see the related documentation.

About linked datasets

Log Analytics supports the creation of linked BigQuery datasets, which give BigQuery read access to the underlying log data. If you create a linked dataset, then you can query that data from the BigQuery interface and join it with other BigQuery datasets.

This document doesn't describe how to create a linked dataset or how to configure Log Analytics to run queries on reserved slots. If you are interested in those topics, then see Query a linked dataset in BigQuery.

Before you begin

This section describes steps that you must complete before you can use Log Analytics.

Configure log buckets

Ensure that your log buckets have been upgraded to use Log Analytics:

  1. In the Google Cloud console, go to the Logs Storage page:

    Go to Logs Storage

    If you use the search bar to find this page, then select the result whose subheading is Logging.

  2. For each log bucket that has a log view that you want to query, ensure that the Log Analytics available column displays Open. If Upgrade is shown, then click Upgrade and complete the dialog.

Configure IAM roles and permissions

This section describes the IAM roles or permissions that are required to use Log Analytics:

  • To get the permissions that you need to use Log Analytics and query log views, ask your administrator to grant you the following IAM roles on your project:

    • To query the _Required and _Default log buckets: Logs Viewer (roles/logging.viewer)
    • To query all log views in a project: Logs View Accessor (roles/logging.viewAccessor)

    You can restrict a principal to a specific log view either by adding an IAM condition to the Logs View Accessor role grant made at the project level, or by adding an IAM binding to the policy file of the log view. For more information, see Control access to a log view.

    These are the same permissions that you need to view log entries on the Logs Explorer page. For information about additional roles that you need to query views on user-defined buckets or to query the _AllLogs view of the _Default log bucket, see Cloud Logging roles.

  • To get the permissions that you need to query analytics views, ask your administrator to grant you the Observability Analytics User (roles/observability.analyticsUser) IAM role on your project.

Query a log view or an analytics view

When you are troubleshooting a problem, you might want to count the log entries with a field that matches a pattern, or compute the average latency of HTTP requests. You can perform these actions by running a SQL query on a log view or an analytics view.
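For example, a query like the following counts recent high-severity entries. This is a sketch: the project, location, bucket, and view identifiers are placeholders that you replace with your own values.

      -- Count log entries with severity ERROR or higher.
      -- PROJECT_ID, LOCATION, BUCKET_ID, and LOG_VIEW_ID are placeholders.
      SELECT
        COUNT(*) AS error_count
      FROM
        `PROJECT_ID.LOCATION.BUCKET_ID.LOG_VIEW_ID`
      WHERE
        severity IN ('ERROR', 'CRITICAL', 'ALERT', 'EMERGENCY')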

To run a SQL query on a log view or an analytics view, do the following:

  1. In the Google Cloud console, go to the Log Analytics page:

    Go to Log Analytics

    If you use the search bar to find this page, then select the result whose subheading is Logging.

  2. If you want to load the default query, then do the following:

    1. In the Views menu, go to the Logs or Analytics Views section, and select the view that you want to query.

      To find a view, you can use the Filter bar or you can scroll through the list:

      • Log views are listed by BUCKET_ID.LOG_VIEW_ID, where these fields refer to the IDs of the log bucket and log view.

      • Analytics views are listed by LOCATION.ANALYTICS_VIEW_ID, where these fields refer to the location and ID of an analytics view.

    2. In the Schema toolbar, click Query.

      The Query pane is updated with a SQL query that queries the view that you selected.

  3. If you want to enter a query, then do the following:

    • To specify a time range, we recommend that you use the time-range selector. If you add a WHERE clause that specifies the timestamp field, then that value overrides the setting in the time-range selector, and the selector is disabled.

    • For examples, see Sample queries.

    • To query a log view, the FROM clause for your query should have the following format:

      FROM `PROJECT_ID.LOCATION.BUCKET_ID.LOG_VIEW_ID`
      
    • To query an analytics view, the FROM clause for your query should have the following format:

      FROM `analytics_view.PROJECT_ID.LOCATION.ANALYTICS_VIEW_ID`
      

    The following describes the meaning of the fields in the previous expressions:

    • PROJECT_ID: The identifier of the project.
    • LOCATION: The location of the log view or the analytics view.
    • BUCKET_ID: The name or ID of the log bucket.
    • LOG_VIEW_ID: The identifier of the log view.
    • ANALYTICS_VIEW_ID: The ID of the analytics view.
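    Putting the pieces together, a complete query against a log view might look like the following sketch; every identifier in the FROM clause is a placeholder:

      -- Group log entries by severity; all identifiers are placeholders.
      SELECT
        severity,
        COUNT(*) AS entry_count
      FROM
        `PROJECT_ID.LOCATION.BUCKET_ID.LOG_VIEW_ID`
      GROUP BY
        severity
      ORDER BY
        entry_count DESC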
  4. In the toolbar, ensure that a button labeled Run query is displayed.

    If the toolbar displays Run in BigQuery, then click Settings and select Log Analytics (default).

  5. Run your query.

    The query is executed and the result of the query is shown in the Results tab.

    You can use the toolbar options to format your query, clear the query, and open the BigQuery SQL reference documentation.

  6. Optional: Create a chart or save results to a custom dashboard.

    By default, your query results are presented as a table. However, you can create a chart, and you can also save the table or chart to a custom dashboard.

    For information about how to create and configure a chart, and how to save a query result to a dashboard, see Chart SQL query results.

Display the schema

The schema of a view defines the view's structure and the data type of each field. This information is important because it determines how you construct your queries. For example, suppose you want to compute the average latency of HTTP requests. You need to know how to access the latency field and whether it is stored as an integer like 100 or as a string like "100". When the latency data is stored as a string, the query must cast the value to a numeric type before computing an average.
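For example, if latency values are stored as strings in a JSON payload, the query can cast them before averaging. The following is a sketch: the field name json_payload.latency_ms and the view path are hypothetical.

      -- Average a latency value stored as a string inside a JSON column.
      -- json_payload.latency_ms and the view path are hypothetical.
      SELECT
        AVG(CAST(JSON_VALUE(json_payload.latency_ms) AS FLOAT64)) AS avg_latency_ms
      FROM
        `PROJECT_ID.LOCATION.BUCKET_ID.LOG_VIEW_ID`

JSON_VALUE extracts the field as a string, and CAST converts it to FLOAT64 so that AVG can operate on it.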

When the data type of a column is JSON, the schema doesn't list the fields available for that column. For example, a log entry can have a field with the name of json_payload. When a log bucket is upgraded to use Log Analytics, that field is mapped to a column with a data type of JSON. The schema doesn't indicate the child fields of the column. That is, you can't use the schema to determine if json_payload.url is a valid reference.
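Because the schema doesn't list the child fields of a JSON column, one way to check whether a field like json_payload.url exists is to query for it directly; JSON_VALUE returns NULL for entries where the field is absent or isn't a scalar. The view path in this sketch is a placeholder.

      -- Inspect a possible child field of a JSON column. The result is NULL
      -- for entries where json_payload.url is absent or not a scalar value.
      SELECT
        JSON_VALUE(json_payload.url) AS url
      FROM
        `PROJECT_ID.LOCATION.BUCKET_ID.LOG_VIEW_ID`
      LIMIT 10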

To identify the schema, do the following:

  1. In the Google Cloud console, go to the Log Analytics page:

    Go to Log Analytics

    If you use the search bar to find this page, then select the result whose subheading is Logging.

  2. In the Views pane, find the log view or analytics view, and then select the view.

    The schema is displayed. For log views, the schema is fixed and corresponds to the LogEntry data structure. For analytics views, you can modify the SQL query to change the schema.

What's next