[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["很难理解","hardToUnderstand","thumb-down"],["信息或示例代码不正确","incorrectInformationOrSampleCode","thumb-down"],["没有我需要的信息/示例","missingTheInformationSamplesINeed","thumb-down"],["翻译问题","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["最后更新时间 (UTC):2025-07-24。"],[],[],null,["# View logs routed to Pub/Sub\n\nThis document explains how you can find log entries that you routed from\nCloud Logging to [Pub/Sub topics](/pubsub/docs), which occurs\nin near real-time.\n\nWe recommend using Pub/Sub for integrating\nCloud Logging logs with [third-party software](#integrate-thru-pubsub).\n\n\nWhen you route log entries to a Pub/Sub topic,\nLogging\npublishes each log entry as a Pub/Sub message as soon as\nLogging receives that log entry. Routed log entries are\ngenerally available within seconds of their arrival to Logging,\nwith 99% of log entries available in less than 60 seconds.\n\nBefore you begin\n----------------\n\nFor a conceptual discussion of sinks, see\n[Overview of routing and storage models: Sinks](/logging/docs/routing/overview#sinks).\n\nFor instructions on how to route your log entries, see\n[Route logs to supported destinations](/logging/docs/export/configure_export_v2).\n\nView logs\n---------\n\nTo view your logs as they are streamed through Pub/Sub,\ndo the following:\n\n1. In the Google Cloud console, go to the **Topics** page:\n\n [Go to **Topics**](https://console.cloud.google.com/cloudpubsub/topicList)\n\n \u003cbr /\u003e\n\n If you use the search bar to find this page, then select the result whose subheading is\n **Pub/Sub**.\n2. Find or create a subscription to the topic used in the log sink, and pull a\n log entry from it. You might have to wait for a new log entry to be\n published.\n\nLogs organization\n-----------------\n\nThe `data` field of each message is a base64-encoded [`LogEntry`](/logging/docs/reference/v2/rest/v2/LogEntry)\nobject.\nAs an example, a Pub/Sub subscriber might pull the following\nobject from a topic that is receiving log entries.\nThe object shown contains a list with a single message, although\nPub/Sub might return several messages if several log entries are\navailable.\nThe `data` value (about 600 characters) and the `ackId` value\n(about 200 characters) have been shortened to make the example easier to read: \n\n```\n{\n \"receivedMessages\": [\n {\n \"ackId\": \"dR1JHlAbEGEIBERNK0EPKVgUWQYyODM...QlVWBwY9HFELH3cOAjYYFlcGICIjIg\",\n \"message\": {\n \"data\": \"eyJtZXRhZGF0YSI6eyJzZXZ0eSI6Il...Dk0OTU2G9nIjoiaGVsbG93b3JsZC5sb2cifQ==\",\n \"attributes\": {\n \"compute.googleapis.com/resource_type\": \"instance\",\n \"compute.googleapis.com/resource_id\": \"123456\"\n },\n \"messageId\": \"43913662360\"\n }\n }\n ]\n}\n```\n\nIf you decode the `data` field and format it, you get the following\n[`LogEntry`](/logging/docs/reference/v2/rest/v2/LogEntry) object: \n\n```\n{\n \"log\": \"helloworld.log\",\n \"insertId\": \"2015-04-15|11:41:00.577447-07|10.52.166.198|-1694494956\",\n \"textPayload\": \"Wed Apr 15 20:40:51 CEST 2015 Hello, world!\",\n \"timestamp\": \"2015-04-15T18:40:56Z\",\n \"labels\": {\n \"compute.googleapis.com\\/resource_type\": \"instance\",\n \"compute.googleapis.com\\/resource_id\": \"123456\"\n },\n \"severity\": \"WARNING\"\n }\n}\n```\n\n\u003cbr /\u003e\n\nThird-party integration with Pub/Sub\n------------------------------------\n\nYou route your log entries to a Pub/Sub topic. 
Third-party integration with Pub/Sub
------------------------------------

You route your log entries to a Pub/Sub topic, and the third
party receives your log entries by subscribing to the same topic.
Logging supports integration with third parties such
as Splunk and Datadog. For a current list of integrations,
see [Partners](/products/observability#partners) for Google Cloud Observability integrations.

To perform the integration, expect to do something like the following:

1. In the project where your log entries originate, create
   your Pub/Sub topic with a
   [default subscription](/pubsub/docs/create-topic#properties_of_a_topic):

   1. Enable the Pub/Sub API.

      [Enable the API](https://console.cloud.google.com/flows/enableapi?apiid=pubsub&redirect=https://console.cloud.google.com/cloudpubsub/topicList)
   2. In the Google Cloud console, go to the **Topics** page:

      [Go to **Topics**](https://console.cloud.google.com/cloudpubsub/topicList)

      If you use the search bar to find this page, then select the result
      whose subheading is **Pub/Sub**.
   3. Click **Create topic**.
   4. In the **Topic ID** field, enter an ID for your topic. For example,
      `projects/my-project-id/topics/my-pubsub-topic`.

      Each message sent to the topic includes the timestamp of the routed
      log entry in the Pub/Sub message `attributes`; for example:

      ```
      "attributes": {
        "logging.googleapis.com/timestamp": "2024-07-01T00:00:00Z"
      }
      ```
   5. Retain the option **Add a default subscription.** Don't select any
      other option.
   6. Click **Create topic**.

2. In the project where your log entries originate, configure
   Logging to route log entries to your topic (a programmatic sketch
   follows this procedure):

   1. In the Google Cloud console, go to the **Log Router** page:

      [Go to **Log Router**](https://console.cloud.google.com/logs/router)

      If you use the search bar to find this page, then select the result
      whose subheading is **Logging**.
   2. Click **Create Sink**, enter a name and description for the sink,
      and then click **Next**.
   3. In the **Sink Service** menu, select **Cloud Pub/Sub topic**, select
      the Pub/Sub topic, and then click **Next**.
   4. Select the log entries to include in the sink and then click **Next**.
   5. Optional: Select the log entries to exclude.
   6. Click **Create Sink**.

      A dialog with the message **Sink created** appears. This
      message indicates that your sink was successfully created with
      permissions to route future matching log entries to the destination
      you selected.
   7. Grant the role of
      [Pub/Sub Publisher (`roles/pubsub.publisher`)](/iam/docs/understanding-roles#pubsub.publisher)
      to the writer identity of the sink. For more information about obtaining
      the writer identity and granting a role, see
      [Set destination permissions](/logging/docs/export/configure_export_v2#dest-auth).

   Cloud Logging is now sending log entries to your Pub/Sub
   topic.

3. Create the subscription.

   For example, if you use Dataflow to
   pull the data from your Pub/Sub topic and send it to
   [Datadog](https://docs.datadoghq.com/integrations/google_cloud_platform/?tab=project#log-collection),
   then you need to perform two steps:

   1. Create, or obtain, a service account, and then grant it the
      IAM roles necessary to subscribe to your topic. At a
      minimum, the service account requires the following roles:

      - [Pub/Sub Subscriber (`roles/pubsub.subscriber`)](/iam/docs/understanding-roles#pubsub.subscriber)
      - [Dataflow Admin (`roles/dataflow.admin`)](/iam/docs/understanding-roles#dataflow.admin)
      - [Dataflow Worker (`roles/dataflow.worker`)](/iam/docs/understanding-roles#dataflow.worker)

      For more information, see the following documents:

      - [Create a service account](/iam/docs/service-accounts-create)
      - [Grant the service account the necessary roles](/iam/docs/create-service-agents#grant-role)
   2. Create a job from a template, and then run that job.
      For this example, you would use the
      [Pub/Sub to Datadog template](/dataflow/docs/guides/templates/provided/pubsub-to-datadog).

Your third party should begin receiving the log entries right away.
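If you prefer to create the sink programmatically instead of using the
**Log Router** page (step 2 of the preceding procedure), the following
minimal Python sketch shows one way to do it with the Cloud Logging client
library. The sink name, project ID, topic ID, and inclusion filter are
placeholder values for illustration; after the sink is created, you still
need to grant its writer identity the Pub/Sub Publisher role, as described
in step 2:

```
from google.cloud import logging

client = logging.Client(project="my-project-id")

# The destination is the fully qualified Pub/Sub topic name.
destination = "pubsub.googleapis.com/projects/my-project-id/topics/my-pubsub-topic"

# Placeholder sink name and filter; this example filter routes only
# log entries with severity WARNING or higher.
sink = client.sink(
    "my-pubsub-sink", filter_="severity>=WARNING", destination=destination
)
sink.create(unique_writer_identity=True)

# Grant this identity roles/pubsub.publisher on the topic so that the
# sink can publish the routed log entries, for example with:
#   gcloud pubsub topics add-iam-policy-binding my-pubsub-topic \
#       --member=WRITER_IDENTITY --role=roles/pubsub.publisher
print("Grant roles/pubsub.publisher to:", sink.writer_identity)
```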
For an exploration of common logs routing scenarios using
Pub/Sub, see
[Scenarios for exporting Cloud Logging data: Splunk](/solutions/exporting-stackdriver-logging-for-splunk).

Troubleshooting
---------------

If log entries seem to be missing from your sink's destination or you otherwise
suspect that your sink isn't properly routing log entries, then see
[Troubleshoot routing logs](/logging/docs/export/troubleshoot).

Pricing
-------

Cloud Logging doesn't charge to route logs to a
supported destination; however, the destination might apply charges.
With the exception of the `_Required` log bucket,
Cloud Logging charges to stream logs into log buckets and
for storage longer than the default retention period of the log bucket.

Cloud Logging doesn't charge for copying logs,
for creating [log scopes](/logging/docs/log-scope/create-and-manage)
or [analytics views](/logging/docs/analyze/about-analytics-views),
or for queries issued through the
**Logs Explorer** or **Log Analytics** pages.

For more information, see the following documents:

- The Cloud Logging sections of the [Google Cloud Observability pricing](https://cloud.google.com/stackdriver/pricing) page.
- Costs when routing log data to other Google Cloud services:

  - [Cloud Storage pricing](https://cloud.google.com/storage/pricing)
  - [BigQuery pricing](https://cloud.google.com/bigquery/pricing#data_ingestion_pricing)
  - [Pub/Sub pricing](https://cloud.google.com/pubsub/pricing)
- [VPC flow log generation charges](https://cloud.google.com/vpc/network-pricing#network-telemetry) apply when you send and then exclude your Virtual Private Cloud flow logs from Cloud Logging.