Scenarios for exporting Stackdriver Logging: Splunk

This scenario shows how to export selected logs from Stackdriver Logging to Cloud Pub/Sub for ingestion into Splunk. Splunk is a security information and event management (SIEM) solution that provides the Splunk Add-on for Google Cloud Platform (GCP). This add-on includes the ability to ingest logs, events, and billing information from GCP. Using this add-on, you can export the logs from GCP to a Splunk installation.

If you are using Splunk solutions that are already deployed, Stackdriver Logging lets you export logs from GCP into your Splunk solution. This ability helps you take advantage of the native logging, monitoring, and diagnostics capabilities while still enabling you to include these logs in your existing systems.

This scenario is part of the series Design patterns for exporting Stackdriver Logging.

Set up the logging export

The following diagram shows the steps for enabling logging export to Splunk through Cloud Pub/Sub.

Enabling logging export to Cloud Pub/Sub.

Set up a Cloud Pub/Sub topic

Follow the instructions to set up a Cloud Pub/Sub topic that will receive your exported logs.

Turn on audit logging for all services

Data access audit logs—except for BigQuery—are disabled by default. In order to enable all audit logs, follow the instructions to update the Cloud IAM policy with the configuration listed in the audit policy documentation. The steps include the following:

  • Downloading the current IAM policy as a file.
  • Adding the audit log policy JSON or YAML object to the current policy file.
  • Updating the Google Cloud Platform project with the changed policy file.
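The three steps above can be sketched in Python. This is an illustrative sketch, not the official tooling: the project ID and file names in the comments are placeholders, and the helper simply shows how the auditConfigs stanza is merged into the downloaded policy file before it is uploaded again.

```python
import json

# The stanza from the audit policy documentation that enables all audit logs.
AUDIT_CONFIGS = [
    {
        "service": "allServices",
        "auditLogConfigs": [
            {"logType": "ADMIN_READ"},
            {"logType": "DATA_READ"},
            {"logType": "DATA_WRITE"},
        ],
    }
]

def add_audit_configs(policy: dict) -> dict:
    """Merge the auditConfigs stanza into a downloaded IAM policy.

    The policy file would come from, for example:
        gcloud projects get-iam-policy [PROJECT_ID] --format=json > policy.json
    and be written back with:
        gcloud projects set-iam-policy [PROJECT_ID] policy.json
    """
    policy["auditConfigs"] = AUDIT_CONFIGS
    return policy

# Example: a minimal downloaded policy containing only bindings.
policy = {"bindings": [{"role": "roles/owner", "members": ["user:admin@example.com"]}]}
updated = add_audit_configs(policy)
print(json.dumps(updated, indent=2))
```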

The following is an example JSON object that enables all audit logs for all services.

"auditConfigs": [
        "service": "allServices",
        "auditLogConfigs": [
            { "logType": "ADMIN_READ" },
            { "logType": "DATA_READ"  },
            { "logType": "DATA_WRITE" },

Configure the logging export

After you set up aggregated exports or logs export, you need to refine the logging filters to export audit logs, virtual machine–related logs, storage logs, and database logs. The following logging filter includes the Admin Activity and Data Access audit logs and the logs for specific resource types.

logName:"/logs/" OR
resource.type:gce OR
resource.type=gcs_bucket OR

Use the gcloud logging sinks create command in the gcloud command-line tool, or the organizations.sinks.create API call, to create a sink with the appropriate filters. The following example gcloud command creates a sink called gcp_logging_sink_pubsub for the organization. The sink includes all children projects and specifies filtering to select specific audit logs.

gcloud logging sinks create gcp_logging_sink_pubsub \
    pubsub.googleapis.com/projects/[PROJECT_ID]/topics/[TOPIC_ID] \
    --log-filter='logName:"/logs/" OR resource.type:gce OR resource.type=gcs_bucket OR resource.type=bigquery_resource' \
    --include-children \
    --organization=[ORGANIZATION_ID]
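For the organizations.sinks.create API call mentioned above, the equivalent request can be sketched as follows. This is illustrative only: the organization ID, project ID, and topic ID are placeholders, and the organization ID goes into the request path rather than the request body.

```python
# The filter clauses from the logging export configuration, joined into one string.
FILTER = (
    'logName:"/logs/" OR '
    'resource.type:gce OR '
    'resource.type=gcs_bucket OR '
    'resource.type=bigquery_resource'
)

def build_sink_request(organization_id: str, project_id: str, topic_id: str):
    """Build the parent path and request body for organizations.sinks.create.

    The sink is created with POST v2/{parent}/sinks; all IDs here are placeholders.
    """
    parent = f"organizations/{organization_id}"
    body = {
        "name": "gcp_logging_sink_pubsub",
        "destination": f"pubsub.googleapis.com/projects/{project_id}/topics/{topic_id}",
        "filter": FILTER,
        "includeChildren": True,
    }
    return parent, body

parent, body = build_sink_request("1234567890", "my-logging-project", "logs-export-topic")
print(parent)
print(body["destination"])
```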

The command output is similar to the following:

Created [].
Please remember to grant `` Pub/Sub Publisher role to the topic.
More information about sinks can be found at /logging/docs/export/configure_export

The response from the API call includes a serviceAccount entry that contains the identity created for the export. This identity is a GCP service account. Until you grant this identity publish access to the destination topic, log entry exports from this sink will fail. For more information, see the next section or the documentation for Granting access for a resource.

Set IAM policy permissions for the Cloud Pub/Sub topic

By adding the service account to the topic with the Pub/Sub Publisher permissions, you grant the service account permission to publish to the topic. Until you add these permissions, the sink export will fail.

To add the permissions to the service account, follow these steps:

  1. In the GCP Console, open the Cloud Pub/Sub Topics page.

  2. Select the topic name.

  3. Click Show info panel, add the sink's service account as a member, and then select the Pub/Sub Publisher role.

    IAM policy permissions - Pub/Sub Publisher.

After you create the logging export by using this filter, log entries begin to populate the Cloud Pub/Sub topic in the configured project. You can confirm that the topic is receiving messages by using the Metrics Explorer in Stackdriver Monitoring. Using the following resource type and metric, observe the number of message-send operations over a brief period. If you have configured the export properly, you see activity above 0 on the graph, as in this screenshot.

  • Resource type: pubsub_topic
  • Metric: pubsub/topic/send_message_operation_count
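The resource type and metric above correspond to a time-series filter in the Monitoring API. A minimal sketch of assembling that filter, assuming a placeholder topic ID (in the API, the metric type carries the pubsub.googleapis.com prefix):

```python
def metric_filter(metric: str, resource_type: str, topic_id: str) -> str:
    """Build a Monitoring API time-series filter for a Pub/Sub topic metric."""
    return (
        f'metric.type="pubsub.googleapis.com/{metric}" '
        f'AND resource.type="{resource_type}" '
        f'AND resource.label.topic_id="{topic_id}"'
    )

# "logs-export-topic" is a placeholder topic name.
f = metric_filter("topic/send_message_operation_count", "pubsub_topic", "logs-export-topic")
print(f)
```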

Activity graph.

Configure the Splunk Add-on for GCP

The Splunk Add-on for GCP uses the Cloud Pub/Sub topic and a service account in GCP. The service account is used to generate a private key that the add-on uses to establish a Cloud Pub/Sub subscription and ingest messages from the logging export topic. The appropriate IAM permissions are required to allow the service account to create the subscription and list the components in the Cloud Pub/Sub project that contains the subscription.

Follow the instructions to set up the Splunk Add-on for GCP. After you configure the add-on, the Cloud Pub/Sub messages from the logging export appear in Splunk.
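Each message that the add-on pulls from the subscription carries an exported LogEntry as JSON in the message data. The sketch below decodes one such message; the payload fields here are fabricated for illustration only, not real log data.

```python
import base64
import json

# Hypothetical Pub/Sub message wrapping an exported LogEntry (illustrative values).
log_entry = {
    "logName": "projects/my-logging-project/logs/cloudaudit.googleapis.com%2Factivity",
    "resource": {"type": "gce_instance", "labels": {"instance_id": "12345"}},
    "severity": "NOTICE",
}
message = {"data": base64.b64encode(json.dumps(log_entry).encode()).decode()}

# What an ingester such as the Splunk add-on does with each message:
# base64-decode the data field and parse the LogEntry JSON.
decoded = json.loads(base64.b64decode(message["data"]))
print(decoded["logName"])
```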

By using Metrics Explorer in Stackdriver Monitoring, you can confirm that the subscription that the Splunk add-on is using is pulling messages. Using the following resource type and metric, observe the number of message-pull operations over a brief period.

  • Resource type: pubsub_subscription
  • Metric: pubsub/subscription/pull_message_operation_count

If you have configured the export properly, you see activity above 0 on the graph, as in this screenshot.

Activity graph for pull operations.

Using the exported logs

After the exported logs have been ingested into Splunk, you can use Splunk as you would any other data source to do the following tasks:

  • Search the logs.
  • Correlate complex events.
  • Visualize results by using dashboards.

What's next
