Cloud Logging allows you to store, search, analyze, monitor and alert on log data and events from Google Cloud and Amazon Web Services. This page shows how to manage Logging using PowerShell. It walks through a simple example of creating logs, log sinks and log metrics.
Read the Cloud Tools for PowerShell cmdlet reference to learn more about Logging cmdlets. To learn more about Logging in general, read the Overview of Logging guide.
Creating logs and log entries
A log is a named collection of log entries within the project. A log entry records status or an event. The entry might be created by Google Cloud services, AWS services, third-party applications, or your own applications. The "message" the log entry carries is called the payload, and it can be a simple string or structured data. Each log entry indicates where it came from by including the name of a monitored resource.
The New-GcLogEntry cmdlet creates a log entry. You must specify the log that the entry belongs to; if the log does not exist, it is created. To associate the log entry with a monitored resource, use the -MonitoredResource parameter. By default, the log entry is associated with the "global" resource. To create a monitored resource, use the New-GcLogMonitoredResource cmdlet.
# Creates a log entry in the log "my-log".
New-GcLogEntry -LogName "my-log" -TextPayload "This is a log entry."

# Creates a log entry associated with a Cloud SQL monitored resource.
$resource = New-GcLogMonitoredResource -ResourceType "cloudsql_database" `
                                       -Labels @{"project_id" = "my-project"; "database_id" = "id"}
New-GcLogEntry -LogName "my-log" `
               -TextPayload "This is a log entry." `
               -MonitoredResource $resource
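The payload does not have to be a plain string. As a sketch of writing structured data, and assuming New-GcLogEntry supports the -JsonPayload parameter described in the cmdlet reference (taking a hashtable), a structured entry could look like this:

# Creates a log entry with a structured (JSON) payload instead of plain text.
# Assumes -JsonPayload accepts a hashtable that becomes the entry's jsonPayload.
New-GcLogEntry -LogName "my-log" `
               -JsonPayload @{"message" = "Deployment finished"; "version" = "1.2.0"}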
You can retrieve log entries with the Get-GcLogEntry cmdlet.
# Gets all entries from the log "my-log".
Get-GcLogEntry -LogName "my-log"

# Gets all entries associated with Compute Engine instances.
Get-GcLogEntry -ResourceName "gce_instance"
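To retrieve only a subset of entries, you can pass a filter written in the Logging filter syntax. A minimal sketch, assuming Get-GcLogEntry exposes a -Filter parameter as described in the cmdlet reference:

# Gets only the entries in "my-log" whose text payload matches the filter.
Get-GcLogEntry -LogName "my-log" -Filter "textPayload = `"This is a log entry.`""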
Creating log sinks
To export log entries, create log sinks with the New-GcLogSink cmdlet. Cloud Logging matches incoming log entries against your sinks, and every entry that matches a sink is copied to that sink's destination. Log entries that exist before the sink is created are not exported. Destinations for exported logs can be Cloud Storage buckets, BigQuery datasets, or Pub/Sub topics.
# Creates a log sink for log entries in the default project.
# The entries will be sent to the Cloud Storage bucket "my-bucket".
New-GcLogSink -Sink "my-sink" -GcsBucketDestination "my-bucket"

# Creates a log sink for log entries in the log "my-log".
# The entries will be sent to the BigQuery dataset "my_dataset".
New-GcLogSink -Sink "my-sink" `
              -LogName "my-log" `
              -BigQueryDataSetDestination "my_dataset"

# Creates a log sink for log entries that match the filter.
# The entries will be sent to the Pub/Sub topic "my-topic".
New-GcLogSink -Sink "my-sink" `
              -Filter "textPayload = `"Testing`"" `
              -PubSubTopicDestination "my-topic"
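To confirm that a sink was created, or to inspect its destination and filter later, you can list the sinks in the project. A minimal sketch, assuming the module's Get-GcLogSink cmdlet and the default project:

# Lists the log sinks in the default project; the output should now include "my-sink".
Get-GcLogSink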
Creating log metrics
You can create log metrics that count the number of log entries matching certain criteria with the New-GcLogMetric cmdlet. These metrics can be used to create charts and alerting policies in Cloud Monitoring.
# Creates a metric for entries in the log "my-log".
New-GcLogMetric -Metric "my-metric" -LogName "my-log"

# Creates a metric for entries associated with Compute Engine instances.
New-GcLogMetric -Metric "my-metric" -ResourceType "gce_instance"

# Creates a metric for entries that match the filter.
New-GcLogMetric -Metric "my-metric" -Filter "textPayload = `"Testing`""
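To check which metrics exist before building charts or alerting policies on them, you can list them. A minimal sketch, assuming the module's Get-GcLogMetric cmdlet and the default project:

# Lists the log metrics in the default project; the output should now include "my-metric".
Get-GcLogMetric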