This document explains how to find log entries that you routed from Cloud Logging to Pub/Sub topics; routing to Pub/Sub occurs in near real time. We recommend using Pub/Sub for integrating Cloud Logging logs with third-party software.
When you route log entries to a Pub/Sub topic, Logging publishes each log entry as a Pub/Sub message as soon as Logging receives that log entry. Routed log entries are generally available within seconds of their arrival to Logging, with 99% of log entries available in less than 60 seconds.
Before you begin
For a conceptual discussion of sinks, see Overview of routing and storage models: Sinks.
For instructions on how to route your log entries, see Route logs to supported destinations.
View logs
To view your logs as they are streamed through Pub/Sub, do the following:
- In the Google Cloud console, go to the Topics page:
  If you use the search bar to find this page, then select the result whose subheading is Pub/Sub.
- Find or create a subscription to the topic used in the log sink, and pull a log entry from it. You might have to wait for a new log entry to be published.
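If you'd rather pull and inspect log entries programmatically than through the console, the following is a minimal sketch that uses the google-cloud-pubsub Python client library. The project ID and subscription ID are placeholders; the pull and acknowledge calls are standard client-library methods.

    from google.cloud import pubsub_v1

    # Placeholders: substitute your own project and subscription IDs.
    project_id = "my-project-id"
    subscription_id = "my-subscription"

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(project_id, subscription_id)

    # Synchronously pull up to 10 messages. The response can be empty if no
    # log entry has been published since the last pull.
    response = subscriber.pull(
        request={"subscription": subscription_path, "max_messages": 10}
    )

    for received in response.received_messages:
        # The client library hands you raw bytes; each routed log entry is a
        # JSON-serialized LogEntry object.
        print(received.message.data.decode("utf-8"))
        # Message attributes are a simple string-to-string map.
        print(dict(received.message.attributes))

    # Acknowledge the messages so that Pub/Sub doesn't redeliver them.
    if response.received_messages:
        subscriber.acknowledge(
            request={
                "subscription": subscription_path,
                "ack_ids": [m.ack_id for m in response.received_messages],
            }
        )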
Logs organization
The data field of each message is a base64-encoded LogEntry object.

As an example, a Pub/Sub subscriber might pull the following object from a topic that is receiving log entries. The object shown contains a list with a single message, although Pub/Sub might return several messages if several log entries are available. The data value (about 600 characters) and the ackId value (about 200 characters) have been shortened to make the example easier to read:
{ "receivedMessages": [ { "ackId": "dR1JHlAbEGEIBERNK0EPKVgUWQYyODM...QlVWBwY9HFELH3cOAjYYFlcGICIjIg", "message": { "data": "eyJtZXRhZGF0YSI6eyJzZXZ0eSI6Il...Dk0OTU2G9nIjoiaGVsbG93b3JsZC5sb2cifQ==", "attributes": { "compute.googleapis.com/resource_type": "instance", "compute.googleapis.com/resource_id": "123456" }, "messageId": "43913662360" } } ] }
If you decode the data field and format it, you get the following LogEntry object:
{ "log": "helloworld.log", "insertId": "2015-04-15|11:41:00.577447-07|10.52.166.198|-1694494956", "textPayload": "Wed Apr 15 20:40:51 CEST 2015 Hello, world!", "timestamp": "2015-04-15T18:40:56Z", "labels": { "compute.googleapis.com\/resource_type": "instance", "compute.googleapis.com\/resource_id": "123456" }, "severity": "WARNING" } }
Third-party integration with Pub/Sub
You route your log entries to a Pub/Sub topic, and the third party receives your log entries by subscribing to the same topic. Logging supports integration with third parties such as Splunk and Datadog. For a current list of integrations, see Partners for Google Cloud Observability integrations.
To perform the integration, expect to do something like the following:
In your project where your log entries originate, create your Pub/Sub topic with a default subscription:

- Enable the Pub/Sub API.
- In the Google Cloud console, go to the Topics page:
  If you use the search bar to find this page, then select the result whose subheading is Pub/Sub.
- Click Create topic.
- In the Topic ID field, enter an ID for your topic. For example, projects/my-project-id/topics/my-pubsub-topic.
  Each message sent to the topic includes the timestamp of the routed log entry in the Pub/Sub message attributes; for example:
      "attributes": { "logging.googleapis.com/timestamp": "2024-07-01T00:00:00Z" }
- Retain the option Add a default subscription. Don't select any other option.
- Click Create topic.
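If you script this setup instead of using the console, the following sketch creates the topic and a subscription with the google-cloud-pubsub Python client. The console's Add a default subscription option has no single-call equivalent in the client library, so the sketch creates the subscription explicitly; all IDs are placeholders.

    from google.cloud import pubsub_v1

    # Placeholders: substitute your own project and topic IDs.
    project_id = "my-project-id"
    topic_id = "my-pubsub-topic"

    publisher = pubsub_v1.PublisherClient()
    subscriber = pubsub_v1.SubscriberClient()

    topic_path = publisher.topic_path(project_id, topic_id)
    subscription_path = subscriber.subscription_path(project_id, topic_id + "-sub")

    # Create the topic, then attach a pull subscription to it.
    publisher.create_topic(request={"name": topic_path})
    subscriber.create_subscription(
        request={"name": subscription_path, "topic": topic_path}
    )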
In your project where your log entries originate, configure Logging to route log entries to your topic:

- In the Google Cloud console, go to the Log Router page:
  If you use the search bar to find this page, then select the result whose subheading is Logging.
- Click Create Sink, enter a name and description for the sink, and then click Next.
- In the Sink Service menu, select Cloud Pub/Sub topic, select the Pub/Sub topic, and then click Next.
- Select the log entries to include in the sink, and then click Next.
- Optional: Select the log entries to exclude.
- Click Create Sink.

A dialog with the message Sink created appears. This message indicates that your sink was successfully created with permissions to route future matching log entries to the destination that you selected.
Grant the role of Pub/Sub Publisher (roles/pubsub.publisher) to the writer identity of the sink. For more information about obtaining the writer identity and granting a role, see Set destination permissions.
Cloud Logging is now sending log entries to your Pub/Sub topic.
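As a programmatic alternative to the console steps above, the following is a minimal sketch that creates an equivalent sink with the google-cloud-logging Python client; the project, sink name, filter, and topic are placeholders. Creating the sink with a unique writer identity returns the service account to which you grant roles/pubsub.publisher.

    import google.cloud.logging

    client = google.cloud.logging.Client(project="my-project-id")  # placeholder

    # Placeholders: sink name, inclusion filter, and destination topic.
    sink = client.sink(
        "my-pubsub-sink",
        filter_="severity>=WARNING",
        destination="pubsub.googleapis.com/projects/my-project-id/topics/my-pubsub-topic",
    )

    # A unique writer identity gives the sink its own service account;
    # grant roles/pubsub.publisher on the topic to sink.writer_identity.
    sink.create(unique_writer_identity=True)
    print("Grant roles/pubsub.publisher to:", sink.writer_identity)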
Create the subscription.
For example, if you use Dataflow to pull the data from your Pub/Sub topic and send it to Datadog, then you need to perform two steps:
Create, or obtain, a service account, and then grant it the IAM roles necessary to subscribe to your topic. At a minimum, the service account requires the following roles (for a sketch of the Pub/Sub grant, see the example after these steps):

- Pub/Sub Subscriber (roles/pubsub.subscriber)
- Dataflow Admin (roles/dataflow.admin)
- Dataflow Worker (roles/dataflow.worker)
Create a job from a template, and then run that job. For this example, you would use the Pub/Sub to Datadog template.
Your third party should begin receiving the log entries right away.
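As a sketch of the role-granting step, the following edits the subscription's IAM policy to grant the Pub/Sub Subscriber role to the service account. The Dataflow roles are project-level roles, which you typically grant in the console or with gcloud rather than on the subscription; the subscription path and service account email are placeholders.

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()

    # Placeholders: substitute your subscription path and service account.
    subscription_path = "projects/my-project-id/subscriptions/my-subscription"
    service_account = "dataflow-sa@my-project-id.iam.gserviceaccount.com"

    # Read the current policy, append a binding, and write it back.
    policy = subscriber.get_iam_policy(request={"resource": subscription_path})
    policy.bindings.add(
        role="roles/pubsub.subscriber",
        members=[f"serviceAccount:{service_account}"],
    )
    subscriber.set_iam_policy(
        request={"resource": subscription_path, "policy": policy}
    )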
For an exploration of common logs routing scenarios using Pub/Sub, see Scenarios for exporting Cloud Logging data: Splunk.
Troubleshooting
If log entries seem to be missing from your sink's destination or you otherwise suspect that your sink isn't properly routing log entries, then see Troubleshoot routing logs.
Pricing
Cloud Logging doesn't charge to route logs to a supported destination; however, the destination might apply charges.

With the exception of the _Required log bucket, Cloud Logging charges to stream logs into log buckets and for storage longer than the default retention period of the log bucket.
Cloud Logging doesn't charge for copying logs, for defining log scopes, or for queries issued through the Logs Explorer or Log Analytics pages.
For more information, see the following documents:
- Cloud Logging pricing summary
- Destination costs:
  - VPC flow log generation charges apply when you send and then exclude your Virtual Private Cloud flow logs from Cloud Logging.