This document describes how to query and analyze the log data stored in log buckets that have been upgraded to use Log Analytics. You can query logs in these buckets by using SQL, which lets you filter and aggregate your logs. To view your query results, you can use the tabular form, or you can visualize the data with charts. These tables and charts can be saved to your custom dashboards.
You can query a log view on a log bucket. When you query a log view, the schema corresponds to that of the `LogEntry` data structure.
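For example, a minimal query against a log view might look like the following sketch. The table name is a placeholder; substitute your own project, region, bucket, and view:

```sql
-- Hypothetical example: count recent log entries by severity.
-- `my-project.us-central1._Default._AllLogs` is a placeholder table name.
SELECT
  severity,
  COUNT(*) AS entry_count
FROM `my-project.us-central1._Default._AllLogs`
GROUP BY severity
ORDER BY entry_count DESC
```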
You can use the Logs Explorer to view log entries stored in log buckets in your project, whether or not the log bucket has been upgraded to use Log Analytics.
Log Analytics doesn't deduplicate log entries, which might affect how you write your queries. Also, there are some restrictions when using Log Analytics. For more information about these topics, see the following documents:
- Troubleshoot: There are duplicate log entries in my Log Analytics results.
- Log Analytics: Restrictions.
About linked datasets
Log Analytics supports the creation of linked BigQuery datasets, which let BigQuery have read access to the underlying data. If you choose to create a linked dataset, then you can do the following:
- Join log entry data with other BigQuery datasets.
- Query log data from another service like the BigQuery Studio page or Looker Studio.
- Improve the performance of the queries that you run from the Log Analytics page by running them on your BigQuery reserved slots.
- Create an alerting policy that monitors the result of a SQL query. For more information, see Monitor your SQL query results with an alerting policy.
This document doesn't describe how to create a linked dataset or how to configure Log Analytics to run queries on reserved slots. If you are interested in those topics, then see Query a linked dataset in BigQuery.
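As a sketch of the first capability in the list above, a join between a linked log dataset and another BigQuery table might look like the following. All table names and the `json_payload.user_id` field are assumptions for illustration:

```sql
-- Hypothetical join between a linked log dataset and a CRM table.
-- The table names and the json_payload.user_id field are assumptions.
SELECT
  logs.timestamp,
  logs.severity,
  users.account_name
FROM `my-project.my_linked_dataset._AllLogs` AS logs
JOIN `my-project.crm.users` AS users
  ON JSON_VALUE(logs.json_payload.user_id) = CAST(users.id AS STRING)
WHERE logs.severity = 'ERROR'
```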
Before you begin
This section describes steps that you must complete before you can use Log Analytics.
Configure log buckets
Ensure that your log buckets have been upgraded to use Log Analytics:
- In the Google Cloud console, go to the Logs Storage page:

  If you use the search bar to find this page, then select the result whose subheading is Logging.

- For each log bucket that has a log view that you want to query, ensure that the Log Analytics available column displays Open. If Upgrade is shown, then click Upgrade and complete the dialog.
Configure IAM roles and permissions
This section describes the IAM roles or permissions that are required to use Log Analytics:
- To get the permissions that you need to use Log Analytics and query log views, ask your administrator to grant you the following IAM roles on your project:

  - To query the `_Required` and `_Default` log buckets: Logs Viewer (`roles/logging.viewer`)
  - To query all log views in a project: Logs View Accessor (`roles/logging.viewAccessor`)

  You can restrict a principal to a specific log view either by adding an IAM condition to the Logs View Accessor role grant made at the project level, or by adding an IAM binding to the policy file of the log view. For more information, see Control access to a log view.

  These are the same permissions that you need to view log entries on the Logs Explorer page. For information about additional roles that you need to query views on user-defined buckets or to query the `_AllLogs` view of the `_Default` log bucket, see Cloud Logging roles.
Query a log view
When you are troubleshooting a problem, you might want to count the log entries whose fields match a pattern, or compute the average latency of HTTP requests. You can perform these actions by running a SQL query on a log view.
To issue a SQL query to a log view, do the following:
- In the Google Cloud console, go to the Log Analytics page:

  If you use the search bar to find this page, then select the result whose subheading is Logging.

- In the Log views list, find the view, and then select Query. The Query pane is populated with a default query, which includes the table name of the log view that is queried. This name has the format `project_ID.region.bucket_ID.view_ID`.

  You can also enter a query in the Query pane, or edit a displayed query. For example queries, see Sample queries.
- To specify a time range, we recommend that you use the time-range selector. However, you can add a `WHERE` clause that specifies the `timestamp` field. When a query includes a `timestamp` field, that timestamp overrides the time range selected in the time-range selector, and the time-range selector is disabled.

- In the toolbar, ensure that a button labeled Run query is displayed.
  If the toolbar displays Run in BigQuery, then click Settings and select Log Analytics (default).

- Run your query.

  The query is executed and the result of the query is shown in the Results tab.

  You can use the toolbar options to format your query, clear the query, and open the BigQuery SQL reference documentation.

- Optional: Create a chart or save your results to a custom dashboard.

  By default, your query results are presented as a table. However, you can create a chart, and you can also save the table or chart to a custom dashboard.

  For information about how to create and configure a chart, and how to save a query result to a dashboard, see Chart SQL query results.
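The kinds of queries described at the start of this section can be sketched as follows. The table name is a placeholder, and the query assumes your entries populate the `text_payload` column:

```sql
-- Hypothetical query: count log entries whose text payload matches a
-- pattern within the last 30 minutes. The table name is a placeholder.
SELECT COUNT(*) AS matching_entries
FROM `my-project.us-central1._Default._AllLogs`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 MINUTE)
  AND text_payload LIKE '%connection refused%'
```

Because this query restricts the `timestamp` field in the `WHERE` clause, the time-range selector is disabled when you run it.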
Display the schema of a log view
The schema of a log view defines its structure and the data type
for each field. This information is important to you because it determines
how you construct your queries. For example, suppose you want to compute the
average latency of HTTP requests. You need to know how to access the latency
field and whether it is stored as an integer like 100
or stored as a
string like "100"
. When the latency data is stored as a string, the query
must cast the value to a numeric value before computing an average.
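As a sketch, assuming a hypothetical latency field stored as a string inside the `json_payload` column, such a cast might look like this:

```sql
-- Hypothetical query: json_payload.latency_ms is an assumed field name.
-- JSON_VALUE extracts the field as a string; SAFE_CAST converts it to a
-- number (or NULL if it isn't numeric) so that AVG can be computed.
SELECT
  AVG(SAFE_CAST(JSON_VALUE(json_payload.latency_ms) AS FLOAT64)) AS avg_latency_ms
FROM `my-project.us-central1._Default._AllLogs`
WHERE JSON_VALUE(json_payload.latency_ms) IS NOT NULL
```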
When the data type of a column is JSON, the schema doesn't list the fields available for that column. For example, a log entry can have a field with the name `json_payload`. When a log bucket is upgraded to use Log Analytics, that field is mapped to a column with a data type of JSON. The schema doesn't indicate the child fields of the column. That is, you can't use the schema to determine whether `json_payload.url` is a valid reference.
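Because the schema doesn't enumerate JSON child fields, one way to explore them is to query the column directly. The following sketch assumes a `json_payload.url` field exists in some of your entries; the table name is a placeholder:

```sql
-- Hypothetical query: inspect whether json_payload.url is populated.
-- JSON_VALUE returns NULL when the path doesn't exist, so this is a
-- safe way to probe an unknown JSON structure.
SELECT
  JSON_VALUE(json_payload.url) AS url,
  COUNT(*) AS entry_count
FROM `my-project.us-central1._Default._AllLogs`
WHERE JSON_VALUE(json_payload.url) IS NOT NULL
GROUP BY url
ORDER BY entry_count DESC
LIMIT 10
```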
To identify the schema for a log view, do the following:
- In the Google Cloud console, go to the Log Analytics page:

  If you use the search bar to find this page, then select the result whose subheading is Logging.

- In the Log views list, find the log view, and then select the name of the log view.

  The schema is displayed. You can use the Filter field to locate specific fields. You can't modify the schema.
What's next
- Save and share a SQL query
- Chart SQL query results
- Sample SQL queries
- Query a linked dataset in BigQuery