# View logs routed to Pub/Sub

This document explains how you can find log entries that you routed from Cloud Logging to [Pub/Sub topics](/pubsub/docs), which occurs in near real-time.

We recommend using Pub/Sub for integrating Cloud Logging logs with [third-party software](#integrate-thru-pubsub).

When you route log entries to a Pub/Sub topic, Logging publishes each log entry as a Pub/Sub message as soon as Logging receives that log entry. Routed log entries are generally available within seconds of their arrival to Logging, with 99% of log entries available in less than 60 seconds.

Before you begin
----------------

For a conceptual discussion of sinks, see [Overview of routing and storage models: Sinks](/logging/docs/routing/overview#sinks).

For instructions on how to route your log entries, see [Route logs to supported destinations](/logging/docs/export/configure_export_v2).

View logs
---------

To view your logs as they are streamed through Pub/Sub, do the following:

1. In the Google Cloud console, go to the **Topics** page:

   [Go to **Topics**](https://console.cloud.google.com/cloudpubsub/topicList)

   If you use the search bar to find this page, then select the result whose subheading is **Pub/Sub**.
2. Find or create a subscription to the topic used in the log sink, and pull a log entry from it. You might have to wait for a new log entry to be published. If you prefer to pull messages programmatically, see the sketch that follows this list.
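As a programmatic alternative to the console, the following minimal Python sketch pulls one routed log entry by using the Pub/Sub client library. The project ID `my-project-id` and subscription ID `my-log-subscription` are hypothetical placeholders for your own values:

```python
# Minimal sketch: pull one routed log entry from a Pub/Sub subscription.
# Assumes google-cloud-pubsub is installed and that the subscription
# "my-log-subscription" (a hypothetical name) already exists.
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(
    "my-project-id", "my-log-subscription"
)

# Pull a single message; the response can be empty, so you might have
# to retry until a new log entry is published.
response = subscriber.pull(
    request={"subscription": subscription_path, "max_messages": 1},
    timeout=30,
)

for received in response.received_messages:
    # For routed log entries, the message payload is the
    # JSON-serialized LogEntry.
    print(received.message.data.decode("utf-8"))
    subscriber.acknowledge(
        request={"subscription": subscription_path, "ack_ids": [received.ack_id]}
    )
```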
Logs organization
-----------------

The `data` field of each message is a base64-encoded [`LogEntry`](/logging/docs/reference/v2/rest/v2/LogEntry) object. As an example, a Pub/Sub subscriber might pull the following object from a topic that is receiving log entries. The object shown contains a list with a single message, although Pub/Sub might return several messages if several log entries are available. The `data` value (about 600 characters) and the `ackId` value (about 200 characters) have been shortened to make the example easier to read:

```
{
  "receivedMessages": [
    {
      "ackId": "dR1JHlAbEGEIBERNK0EPKVgUWQYyODM...QlVWBwY9HFELH3cOAjYYFlcGICIjIg",
      "message": {
        "data": "eyJtZXRhZGF0YSI6eyJzZXZ0eSI6Il...Dk0OTU2G9nIjoiaGVsbG93b3JsZC5sb2cifQ==",
        "attributes": {
          "compute.googleapis.com/resource_type": "instance",
          "compute.googleapis.com/resource_id": "123456"
        },
        "messageId": "43913662360"
      }
    }
  ]
}
```

If you decode the `data` field and format it, you get the following [`LogEntry`](/logging/docs/reference/v2/rest/v2/LogEntry) object:

```
{
  "log": "helloworld.log",
  "insertId": "2015-04-15|11:41:00.577447-07|10.52.166.198|-1694494956",
  "textPayload": "Wed Apr 15 20:40:51 CEST 2015 Hello, world!",
  "timestamp": "2015-04-15T18:40:56Z",
  "labels": {
    "compute.googleapis.com/resource_type": "instance",
    "compute.googleapis.com/resource_id": "123456"
  },
  "severity": "WARNING"
}
```
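The base64 step applies when you read messages through the Pub/Sub REST API, where the `data` field arrives base64-encoded; the Cloud Client Libraries instead deliver `data` as raw bytes that you can parse directly. The following Python sketch shows the decoding round trip with a shortened, hypothetical payload:

```python
import base64
import json

# Build a shortened example payload: the JSON-serialized LogEntry,
# base64-encoded as it appears in the REST API's "data" field.
entry_json = json.dumps({"log": "helloworld.log", "severity": "WARNING"})
message = {
    "data": base64.b64encode(entry_json.encode("utf-8")).decode("ascii"),
    "messageId": "43913662360",
}

# Decode the base64 payload, then parse the JSON back into a dict.
log_entry = json.loads(base64.b64decode(message["data"]))
print(log_entry["severity"])  # WARNING
```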
Third-party integration with Pub/Sub
------------------------------------

You route your log entries to a Pub/Sub topic, and the third party receives your log entries by subscribing to the same topic. Logging supports logging integration with third parties, such as Splunk or Datadog. For a current list of integrations, see [Partners](/products/observability#partners) for Google Cloud Observability integrations.

To perform the integration, expect to do something like the following:

1. In your project where your log entries originate, create your Pub/Sub topic with a [default subscription](/pubsub/docs/create-topic#properties_of_a_topic):

   1. Enable the Pub/Sub API.

      [Enable the API](https://console.cloud.google.com/flows/enableapi?apiid=pubsub&redirect=https://console.cloud.google.com/cloudpubsub/topicList)

   2. In the Google Cloud console, go to the **Topics** page:

      [Go to **Topics**](https://console.cloud.google.com/cloudpubsub/topicList)

      If you use the search bar to find this page, then select the result whose subheading is **Pub/Sub**.

   3. Click **Create topic**.

   4. In the **Topic ID** field, enter an ID for your topic. For example, `projects/my-project-id/topics/my-pubsub-topic`.

      Each message sent to the topic includes the timestamp of the routed log entry in the Pub/Sub message `attributes`; for example:

          "attributes": {
              "logging.googleapis.com/timestamp": "2024-07-01T00:00:00Z"
          }

   5. Retain the option **Add a default subscription**. Don't select any other option.

   6. Click **Create topic**.

2. In your project where your log entries originate, configure Logging to route log entries to your topic:

   1. In the Google Cloud console, go to the **Log Router** page:

      [Go to **Log Router**](https://console.cloud.google.com/logs/router)

      If you use the search bar to find this page, then select the result whose subheading is **Logging**.

   2. Click **Create Sink**, enter a name and description for the sink, and then click **Next**.

   3. In the **Sink Service** menu, select **Cloud Pub/Sub topic**, select the Pub/Sub topic, and then click **Next**.

   4. Select the log entries to include in the sink and then click **Next**.

   5. Optional: Select the log entries to exclude.

   6. Click **Create Sink**.

      A dialog with the message **Sink created** appears. This message indicates that your sink was successfully created with permissions to route future matching log entries to the destination you selected.

   7. Grant the role of [Pub/Sub Publisher (`roles/pubsub.publisher`)](/iam/docs/understanding-roles#pubsub.publisher) to the writer identity of the sink. For more information about obtaining the writer identity and granting a role, see [Set destination permissions](/logging/docs/export/configure_export_v2#dest-auth).

      Cloud Logging is now sending log entries to your Pub/Sub topic. For a scripted alternative to steps 1 and 2, see the sketch after this list.

3. Create the subscription.

   For example, if you use Dataflow to pull the data from your Pub/Sub topic and send it to [Datadog](https://docs.datadoghq.com/integrations/google_cloud_platform/?tab=project#log-collection), then you need to perform two steps:

   1. Create, or obtain, a service account, and then grant it the IAM roles necessary to subscribe to your topic. At a minimum, the service account requires the following roles:

      - [Pub/Sub Subscriber (`roles/pubsub.subscriber`)](/iam/docs/understanding-roles#pubsub.subscriber)
      - [Dataflow Admin (`roles/dataflow.admin`)](/iam/docs/understanding-roles#dataflow.admin)
      - [Dataflow Worker (`roles/dataflow.worker`)](/iam/docs/understanding-roles#dataflow.worker)

      For more information, see the following documents:

      - [Create a service account](/iam/docs/service-accounts-create)
      - [Grant the service account the necessary roles](/iam/docs/create-service-agents#grant-role)

   2. Create a job from a template, and then run that job. For this example, you would use the [Pub/Sub to Datadog template](/dataflow/docs/guides/templates/provided/pubsub-to-datadog).
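If you prefer to script the topic and sink creation rather than click through the console, the following Python sketch combines the Pub/Sub and Cloud Logging client libraries. All resource names (`my-project-id`, `my-pubsub-topic`, `my-log-subscription`, `my-log-sink`) and the log filter are hypothetical placeholders, and granting the Publisher role to the writer identity is left as a final manual step:

```python
# Sketch: create a Pub/Sub topic with a subscription, then create a
# Cloud Logging sink that routes matching log entries to that topic.
# Assumes google-cloud-pubsub and google-cloud-logging are installed;
# all resource names here are hypothetical placeholders.
from google.cloud import logging as cloud_logging
from google.cloud import pubsub_v1

project_id = "my-project-id"

# 1. Create the topic and a subscription for the third party to read from.
publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()
topic_path = publisher.topic_path(project_id, "my-pubsub-topic")
subscription_path = subscriber.subscription_path(project_id, "my-log-subscription")

publisher.create_topic(request={"name": topic_path})
subscriber.create_subscription(
    request={"name": subscription_path, "topic": topic_path}
)

# 2. Create the sink; the destination URI identifies the topic.
logging_client = cloud_logging.Client(project=project_id)
sink = logging_client.sink(
    "my-log-sink",
    filter_="severity>=WARNING",  # example filter; route only matching entries
    destination=f"pubsub.googleapis.com/{topic_path}",
)
sink.create(unique_writer_identity=True)

# The sink writes as a service account (its writer identity); grant that
# identity roles/pubsub.publisher on the topic, for example with IAM tooling.
print("Grant roles/pubsub.publisher to:", sink.writer_identity)
```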
Your third party should begin receiving the log entries right away.

For an exploration of common logs routing scenarios using Pub/Sub, see [Scenarios for exporting Cloud Logging data: Splunk](/solutions/exporting-stackdriver-logging-for-splunk).

Troubleshooting
---------------

If log entries seem to be missing from your sink's destination or you otherwise suspect that your sink isn't properly routing log entries, then see [Troubleshoot routing logs](/logging/docs/export/troubleshoot).

Pricing
-------

Cloud Logging doesn't charge to route logs to a supported destination; however, the destination might apply charges. With the exception of the `_Required` log bucket, Cloud Logging charges to stream logs into log buckets and for storage longer than the default retention period of the log bucket.

Cloud Logging doesn't charge for copying logs, for creating [log scopes](/logging/docs/log-scope/create-and-manage) or [analytics views](/logging/docs/analyze/about-analytics-views), or for queries issued through the **Logs Explorer** or **Log Analytics** pages.

For more information, see the following documents:

- The Cloud Logging sections of the [Google Cloud Observability pricing](https://cloud.google.com/stackdriver/pricing) page.
- Costs when routing log data to other Google Cloud services:
  - [Cloud Storage pricing](https://cloud.google.com/storage/pricing)
  - [BigQuery pricing](https://cloud.google.com/bigquery/pricing#data_ingestion_pricing)
  - [Pub/Sub pricing](https://cloud.google.com/pubsub/pricing)
- [VPC flow log generation charges](https://cloud.google.com/vpc/network-pricing#network-telemetry) apply when you send and then exclude your Virtual Private Cloud flow logs from Cloud Logging.