View logs routed to Pub/Sub

This document explains how to find log entries that you routed from Cloud Logging to Pub/Sub topics. Routing to Pub/Sub occurs in near real time, and we recommend using Pub/Sub to integrate Cloud Logging logs with third-party software.

When you route logs to a Pub/Sub topic, Logging publishes each log entry as a Pub/Sub message as soon as Logging receives that log entry. Routed logs are generally available within seconds of their arrival in Logging, with 99% of logs available in less than 60 seconds.

Before you begin

For a conceptual discussion of sinks, see Overview of routing and storage models: Sinks.

For instructions on how to route your logs, see Route logs to supported destinations.

View logs

To view your logs as they are streamed through Pub/Sub, do the following:

  1. In the navigation panel of the Google Cloud console, select Pub/Sub, and then select Topics:

    Go to Topics

  2. Find or create a subscription to the topic used in the log sink, and pull a log entry from it. You might have to wait for a new log entry to be published.

Logs organization

The data field of each message is a base64-encoded LogEntry object. As an example, a Pub/Sub subscriber might pull the following object from a topic that is receiving log entries. The object shown contains a list with a single message, although Pub/Sub might return several messages if several log entries are available. The data value (about 600 characters) and the ackId value (about 200 characters) have been shortened to make the example easier to read:

{
 "receivedMessages": [
  {
   "ackId": "dR1JHlAbEGEIBERNK0EPKVgUWQYyODM...QlVWBwY9HFELH3cOAjYYFlcGICIjIg",
   "message": {
    "data": "eyJtZXRhZGF0YSI6eyJzZXZ0eSI6Il...Dk0OTU2G9nIjoiaGVsbG93b3JsZC5sb2cifQ==",
    "attributes": {
     "compute.googleapis.com/resource_type": "instance",
     "compute.googleapis.com/resource_id": "123456"
    },
    "messageId": "43913662360"
   }
  }
 ]
}

If you decode the data field and format it, you get the following LogEntry object:

{
  "log": "helloworld.log",
  "insertId": "2015-04-15|11:41:00.577447-07|10.52.166.198|-1694494956",
  "textPayload": "Wed Apr 15 20:40:51 CEST 2015 Hello, world!",
  "timestamp": "2015-04-15T18:40:56Z",
  "labels": {
    "compute.googleapis.com\/resource_type": "instance",
    "compute.googleapis.com\/resource_id": "123456"
  },
  "severity": "WARNING"
}
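The decoding described above can be sketched in Python with only the standard library. The payload below is a stand-in built for illustration, not a real routed entry:

```python
import base64
import json

def decode_log_entries(pull_response):
    """Decode the base64 data field of each pulled message into a LogEntry dict."""
    entries = []
    for received in pull_response.get("receivedMessages", []):
        raw = base64.b64decode(received["message"]["data"])
        entries.append(json.loads(raw))
    return entries

# Build a sample pull response (illustrative values only).
log_entry = {"textPayload": "Hello, world!", "severity": "WARNING"}
sample_response = {
    "receivedMessages": [
        {
            "ackId": "example-ack-id",
            "message": {
                "data": base64.b64encode(json.dumps(log_entry).encode()).decode(),
                "messageId": "43913662360",
            },
        }
    ]
}

for entry in decode_log_entries(sample_response):
    print(entry["severity"], entry["textPayload"])  # WARNING Hello, world!
```

After decoding, remember to acknowledge each message by its `ackId` so that Pub/Sub doesn't redeliver it.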

Third-party integration with Pub/Sub

Logging supports logging integration with third parties, such as Splunk. For a current list of integrations, see Partners for Google Cloud Observability integrations.

You route your logs through a Pub/Sub topic and the third party receives your logs by subscribing to the same topic.
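On the receiving side, a subscriber often filters decoded entries before forwarding them to the third-party system. A minimal sketch using the LogEntry severity names defined by Cloud Logging; the helper function and the threshold value are illustrative, not part of any API:

```python
# Cloud Logging severity levels, ordered from least to most severe.
SEVERITY_ORDER = [
    "DEFAULT", "DEBUG", "INFO", "NOTICE", "WARNING",
    "ERROR", "CRITICAL", "ALERT", "EMERGENCY",
]
SEVERITY_RANK = {name: rank for rank, name in enumerate(SEVERITY_ORDER)}

def meets_threshold(entry, threshold="WARNING"):
    """Return True if the decoded LogEntry meets the severity threshold."""
    rank = SEVERITY_RANK.get(entry.get("severity", "DEFAULT"), 0)
    return rank >= SEVERITY_RANK[threshold]

entries = [
    {"severity": "INFO", "textPayload": "routine event"},
    {"severity": "ERROR", "textPayload": "something broke"},
]
forwarded = [e for e in entries if meets_threshold(e)]
print(forwarded)  # only the ERROR entry
```

If you only ever need a subset of entries, filtering in the sink itself (with an inclusion filter) is cheaper than filtering on the subscriber side, because excluded entries are never published.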

To perform the integration, expect to do something like the following:

  1. Obtain from the third party the name of a Google Cloud service account that they created in their Google Cloud project. For example, 12345-xyz@developer.gserviceaccount.com. You use this name to give the third party permission to receive your logs.

  2. Select the Google Cloud project that contains the logs you want to route.

  3. Enable the Pub/Sub API:

    Enable the API

  4. Create a Pub/Sub topic. You can create a topic when you configure a log sink, or by following these steps:

    1. In the navigation panel of the Google Cloud console, select Pub/Sub, and then select Topics:

      Go to Topics

    2. Select Create topic and enter a topic name. For example, projects/my-project-id/topics/my-pubsub-topic. You route your logs to this topic.

      Each message sent to the topic includes the timestamp of the routed log entry in the Pub/Sub message attributes; for example:

      "attributes": {
        "logging.googleapis.com/timestamp": "2018-10-01T00:00:00Z"
      }
      
    3. Click Create topic.

    4. Authorize Logging to route logs to the topic. For instructions, see Set destination permissions.

  5. Authorize the third party to subscribe to your topic:

    1. In the navigation panel of the Google Cloud console, select Pub/Sub, and then select Topics:

      Go to Topics

    2. Select your topic.
    3. Select Permissions.
    4. Select Add permission and enter the third party's service account name.
    5. In the Select a role menu, select Pub/Sub Subscriber.
    6. Click Save.
  6. Provide the third party with the name of your Pub/Sub topic; for example, projects/my-project-number/topics/my-pubsub-topic. They should subscribe to the topic before you start routing.

  7. Start routing the logs after your third party has subscribed to the topic:

    1. In the navigation panel of the Google Cloud console, select Logging, and then select Log Router:

      Go to Log Router

    2. In your project containing the logs you want to route, click Create Sink.
    3. Enter a name and description for the sink, and then click Next.
    4. In the Sink Service menu, select Cloud Pub/Sub topic.
    5. In the Cloud Pub/Sub topic menu, select the Pub/Sub topic to which the third party is subscribed, and then click Next.
    6. Select the logs to include in the sink and then click Next.
    7. Optional: Select the logs to exclude.
    8. Click Create Sink.

      A dialog with the message Sink created appears. This message indicates that your sink was successfully created with permissions to write future matching logs to the destination you selected.

Your third party should begin receiving the log entries right away.
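The `logging.googleapis.com/timestamp` attribute shown in step 4 can be read without decoding the message body, for example to skip entries that predate the point at which routing began. A minimal sketch, assuming second-precision timestamps; the cutoff value is illustrative:

```python
from datetime import datetime, timezone

def entry_timestamp(attributes):
    """Parse the routed log entry's timestamp from Pub/Sub message attributes."""
    ts = attributes["logging.googleapis.com/timestamp"]
    # fromisoformat() accepts a trailing "Z" only on newer Python
    # versions, so normalize it to an explicit UTC offset first.
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

attrs = {"logging.googleapis.com/timestamp": "2018-10-01T00:00:00Z"}
cutoff = datetime(2018, 9, 30, tzinfo=timezone.utc)
print(entry_timestamp(attrs) >= cutoff)  # True
```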

For an exploration of common log-routing scenarios using Pub/Sub, see Scenarios for exporting Cloud Logging data: Splunk.

Troubleshooting

If logs seem to be missing from your sink's destination or you otherwise suspect that your sink isn't properly routing logs, then see Troubleshoot routing logs.

Pricing

Cloud Logging doesn't charge to route logs to a supported destination; however, the destination might apply charges. With the exception of the _Required log bucket, Cloud Logging charges to stream logs into log buckets and for storage longer than the default retention period of the log bucket.

Cloud Logging doesn't charge for copying logs, or for queries issued through the Logs Explorer page or through the Log Analytics page.

For more information, see the following documents: