Using Flow Analyzer, you can perform the following tasks:
- Build and run a simple query on VPC Flow Logs
- Build a SQL filter (using a WHERE clause) for the query on VPC Flow Logs
- Organize the results by selected fields and sort them by total traffic or aggregate packets
- View the traffic at chosen time intervals
- View the top five traffic flows over time in a graphical format, compared with the rest of the traffic
- View the resources with the highest traffic, aggregated over the selected duration, in a tabular format
- View the details of the traffic between a specific source and destination pair from the query results
- Drill down into the query results using the remaining fields available in VPC Flow Logs
Before you begin
In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
Required roles and permissions
Because Flow Analyzer reads data on behalf of the user, ensure that you have sufficient permissions to read the bucket containing the logs. The bucket must also be upgraded to use Log Analytics.
To let a user read logs in the buckets by using the Logs Explorer or Log Analytics pages, grant one of the following roles:
- For access to the `_Default` view on the `_Default` bucket, grant the Logs Viewer role (`roles/logging.viewer`).
- For access to all logs in the `_Default` log bucket, including data access logs, grant the Private Logs Viewer role (`roles/logging.privateLogViewer`).

For more information, see Logging roles.
To let a user read logs that are stored in a user-defined bucket, grant the Logs View Accessor role (`roles/logging.viewAccessor`). You can restrict authorization to a specific log view. For more information, see Control access to a log view.

Alternatively, create a custom role that grants the following permissions:

- `logging.buckets.get`
- `logging.buckets.list`
- `logging.logEntries.list`
- `logging.logs.list`
- `resourcemanager.projects.get`
Build and run a query
To build and run a query using basic filters, do the following:
Console
In the Google Cloud console, go to the Flow Analyzer page.
Click Source bucket and do the following:
- In the Log bucket field, select the log bucket containing the flow logs that you want to query. By default, flow logs are stored in the _Default log bucket.
- In the Log bucket view field, select a log view.
- Optional: If you want to query flow logs associated with a specific VPC Flow Logs configuration, do the following:
  - Select the Select specific configuration checkbox.
  - In the Log configurations list, select one or more VPC Flow Logs configurations. The Flow logs configured for subnetworks option selects all flow logs for all subnets in the log bucket.
In the Traffic aggregation menu, select one of the following options:
- Source - Destination: aggregate the traffic from the source to the destination.
- Client - Server: aggregate the traffic in both directions, treating as servers the resources that have lower port numbers, match service definitions, or have GKE service properties.
For more information, see Traffic aggregation.
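As a rough illustration of the Client - Server option, the sketch below treats the endpoint with the lower port number as the server so that both directions of a conversation aggregate under one client-server pair. This is an assumption-level simplification: the actual heuristic also uses service definitions and GKE service properties, which the sketch omits.

```python
def client_server(endpoint_a, endpoint_b):
    """Pick which endpoint of a flow counts as the server, so traffic in
    both directions can be aggregated under one (client, server) pair.

    Simplified sketch: the endpoint with the lower port number is the
    server. The real heuristic also considers service definitions and
    GKE service properties.
    """
    (_, a_port), (_, b_port) = endpoint_a, endpoint_b
    if a_port <= b_port:
        return endpoint_b, endpoint_a  # (client, server)
    return endpoint_a, endpoint_b      # (client, server)

# Both directions of the same conversation map to the same pair:
client, server = client_server(("vm-1", 51234), ("web-1", 443))
```

With this rule, the flow `("web-1", 443) -> ("vm-1", 51234)` and its reverse both aggregate under the same `("vm-1", ...)` client and `("web-1", 443)` server.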
In the time-range selector, set the time range of your query. The default time range is one hour. You can select a preset time range, specify a custom start and end time, or select a time range around a specific time.
In the Filter lists, select one or more query filters. Each filter corresponds to a VPC Flow Logs field. For more information about these fields, see Record format. If you don't select any filters, Flow Analyzer shows the query results for all flows within the selected period of time.
If you select more than one value for the same filter, an `OR` operator is used. If you select more than one filter in the same Filter list, an `AND` operator is used. For example, if you select two IP address values (`10.10.0.10` and `10.10.0.20`) and two Country values (`usa` and `fra`), the following filter logic is applied to the query: `(IP=10.10.0.10 OR IP=10.10.0.20) AND (Country=usa OR Country=fra)`.
Select how to organize your query results by using the Organize flows by lists, or keep the default values.
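The combination logic above (values within a filter are ORed, filters are ANDed) can be sketched in a few lines of illustrative Python; this is not part of Flow Analyzer, just a restatement of the rule:

```python
def build_filter(selections):
    """Combine filter selections the way described above: values chosen
    for the same filter are joined with OR, and the per-filter groups
    are joined with AND. Illustrative only."""
    groups = [
        "(" + " OR ".join(f"{field}={value}" for value in values) + ")"
        for field, values in selections.items()
    ]
    return " AND ".join(groups)

expr = build_filter({
    "IP": ["10.10.0.10", "10.10.0.20"],
    "Country": ["usa", "fra"],
})
print(expr)
# (IP=10.10.0.10 OR IP=10.10.0.20) AND (Country=usa OR Country=fra)
```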
Click Run new query.
The Highest data flows chart and the All data flows table are updated.
You can use the Display options panel to customize your query results. For more information, see Display options. To select custom options, see Customize display options.
Build and run a SQL query
To build and run a query in Flow Analyzer using SQL filters, do the following:
Console
In the Google Cloud console, go to the Flow Analyzer page.
Select a log bucket. If you plan to use the _Default log bucket, you can skip this step.
To set the time range of your query, use the time-range selector or select Re-run selected period.
In the Traffic aggregation menu, select one of the following options:
- Source - Destination: aggregate the traffic from the source to the destination.
- Client - Server: aggregate the traffic in both directions, treating as servers the resources that have lower port numbers or match service definitions.
For more information, see Traffic aggregation.
Click SQL Filters.
Enter the SQL filter query using BigQuery SQL syntax.
To view filter expression syntax and examples, click Filter expression syntax and examples.
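A hypothetical filter is shown below. The field names (`src_ip`, `country`) are placeholders modeled on the VPC Flow Logs record format, not confirmed Flow Analyzer identifiers; check the built-in Filter expression syntax and examples for the exact names available in your log bucket.

```sql
-- Hypothetical filter: field names are placeholders, not confirmed identifiers.
(src_ip = "10.10.0.10" OR src_ip = "10.10.0.20")
AND country = "usa"
```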
To organize the flow details, select fields in the Organize flows by lists.
Click Run new query.
The Highest data flows chart and the All data flows table are updated.
You can use the Display options panel to customize your query results. For more information, see Display options. To select custom options, see Customize display options.
Customize display options
To view specific details of the traffic flows, you can customize the display options. For more information about the various display options available, see Display options.
Console
- Build the query.
- Select a log bucket. If you plan to use the _Default log bucket, you can skip this step.
- To set the time range of your query, use the time-range selector or select Re-run selected period.
- Select the filters.
- Select the fields to organize the results.
- Run the query.
- Select the metric type: Bytes sent or Packets sent.
Select a metric aggregation option.
If you select Bytes sent as the metric, choose one of the following options:
- Total traffic: the total traffic for the chosen time period. Enabled by default.
- Average traffic rate: the average traffic rate for the chosen time period. Calculated only for the alignment periods during which the traffic was observed.
- Median traffic rate: the median traffic rate for the chosen time period. Calculated only for the alignment periods during which the traffic was observed.
- P95 traffic rate: the 95th percentile traffic rate for the chosen time period. Calculated only for the alignment periods during which the traffic was observed.
- Maximum traffic rate: the maximum traffic rate for the chosen time period.
If you select Packets sent as the metric, choose one of the following options:
- Aggregate packets: the aggregate number of packets for the chosen time period. Enabled by default.
- Average packets rate: the average packet rate for the chosen time period. Calculated only for the alignment periods during which the traffic was observed.
- Median packets rate: the median packet rate for the chosen time period. Calculated only for the alignment periods during which the traffic was observed.
- P95 packets rate: the 95th percentile packet rate for the chosen time period. Calculated only for the alignment periods during which the traffic was observed.
- Maximum packets rate: the maximum packet rate for the chosen time period.
For more information about the various metric aggregation options, see Metric aggregations.
Select the Alignment period. For more information about the alignment period, see Alignment period.
Choose a sampling point.
- Source endpoint: the number of bytes sent or packets sent as reported at the source endpoint of a flow.
- Destination endpoint: the number of bytes sent or packets sent as reported at the destination endpoint of a flow.
- Sum of source and destination endpoint: the sum of bytes sent or packets sent as reported by both endpoints of a flow.
- Average of source and destination endpoint: an average of bytes sent or packets sent as reported by both endpoints of a flow if both source and destination details are available in VPC Flow Logs.
For more information, see Sampling point.
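The four sampling-point options reduce to simple arithmetic over the two per-endpoint counters; the sketch below restates them (illustrative only, not product code):

```python
def flow_metric(src_reported, dest_reported, sampling_point):
    """Combine per-endpoint byte or packet counters according to the
    sampling-point options described above (illustrative sketch)."""
    if sampling_point == "source":
        return src_reported
    if sampling_point == "destination":
        return dest_reported
    if sampling_point == "sum":
        return src_reported + dest_reported
    if sampling_point == "average":
        # Meaningful only when both endpoints reported the flow.
        return (src_reported + dest_reported) / 2
    raise ValueError(f"unknown sampling point: {sampling_point}")
```

Because of sampling, the two endpoints of one flow can report different counts (for example 1000 bytes at the source and 800 at the destination), which is why the sum and average options exist.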
View flow details
To view flow details for a selected flow in the data flows table, do the following:
Console
- Build the query.
- Select a log bucket. If you plan to use the _Default log bucket, you can skip this step.
- To set the time range of your query, use the time-range selector or select Re-run selected period.
- Select the filters.
- Select the fields to organize the results.
- Run the query.
- In the All data flows table, click Details for any flow. The Flow details page that appears shows all the resources matching the selected filters and the traffic of these resources.
Drill down traffic flows
You can further refine the traffic of the selected resources. Using Flow Analyzer, you can drill down into the query results by using the remaining fields available in VPC Flow Logs. For more information, see View flow details.
To drill down traffic flows using more fields, do the following:
Console
- Build the query.
- Select a log bucket. If you plan to use the _Default log bucket, you can skip this step.
- To set the time range of your query, use the time-range selector or select Re-run selected period.
- Select the filters.
- Select the fields to organize the results.
- Run the query.
- In the All data flows table, click Details for any flow. The Flow details page that appears shows all the resources matching the selected filters and the traffic of these resources.
- In the Drill down by list, select a field to drill down by.
- To compare with past traffic, click the Compare to past toggle. This feature shows six lines: three solid lines for the top three traffic flows from the drill down, and three dashed lines in corresponding colors representing the past traffic.
What's next
- Metrics and alignment period
- Run Connectivity Tests
- Monitor your traffic flows
- Troubleshoot data issues in Flow Analyzer