This scenario explains how to export Cloud Audit Logs, Google Kubernetes Engine logs, Cloud Storage logs, and BigQuery logs to Datadog. In Datadog, you can correlate these logs with metrics and distributed traces to get a view of your entire Google Cloud infrastructure.
You can access Datadog through two different domains, depending on your location: EU users use the datadoghq.eu domain, and non-EU users use the datadoghq.com domain. We provide both links in this scenario.
This setup uses Pub/Sub push subscriptions, which are incompatible with VPC Service Controls.
This scenario is part of the series Design patterns for exporting Cloud Logging.
Before you begin
- In the Google Cloud Console, on the project selector page, select or create a Google Cloud project.
- If you don't already have a Datadog account, create one by using the Datadog free trial.
- Enable the Datadog integration with Google Cloud by following the installation instructions.
Setting up the logging export
The following diagram shows the components used to export logs to Datadog through Pub/Sub. Various Google Cloud products produce logs, which are sent to Cloud Logging. These logs are filtered, and the remaining logs are sent to Datadog by using a Pub/Sub sink.
Set up a Pub/Sub topic and subscription
- In the Cloud Console, activate Cloud Shell.
- Configure Cloud Shell to use the project that you selected. Replace PROJECT_ID with your project ID:

    export PROJECT_ID=PROJECT_ID
    gcloud config set project ${PROJECT_ID}
- Set up a Pub/Sub topic to export logs:

    gcloud pubsub topics create datadog-logs-export-topic
- Copy your Datadog API key from the Datadog API settings page. If you're an EU user, use the EU page. If you're a non-EU user, use the default US page.
- In Cloud Shell, create a subscription to send logs from the Pub/Sub topic to Datadog (an optional verification check follows these steps):

    gcloud pubsub subscriptions create \
      datadog-logs-export-subscription \
      --topic=datadog-logs-export-topic \
      --push-endpoint=https://gcp-intake.logs.datadoghq.TLD/v1/input/API_KEY/
  Replace the following:
  - API_KEY: your Datadog API key
  - TLD: eu if you're an EU user, com if you're a non-EU user
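Optionally, you can verify that the subscription points at the expected Datadog endpoint by reading back its pushConfig.pushEndpoint field:

# Optional check: print the push endpoint of the new subscription.
gcloud pubsub subscriptions describe datadog-logs-export-subscription \
  --format="value(pushConfig.pushEndpoint)"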
Turn on audit logging for all services
In Cloud Audit Logs, Admin Activity audit logs are always written and can't be disabled. However, Data Access audit logs are disabled by default (except for BigQuery, where Data Access audit logs are enabled by default and don't count against your logs allotment). In this section, you enable all audit logs for all Cloud services and users at the project level.
In Cloud Shell, read your project's Identity and Access Management (IAM) policy and save it as a file:
gcloud projects get-iam-policy ${PROJECT_ID} | tee /tmp/policy.yaml
The output is similar to the following:
bindings:
- members:
  - user:colleague@example.com
  role: roles/editor
- members:
  - user:myself@example.com
  role: roles/owner
etag: BwVM-FDzeYM=
version: 1
Add an auditConfigs section to the IAM policy file that you downloaded in order to enable Data Access logs for all services and users:

cat << EOF >> /tmp/policy.yaml
auditConfigs:
- auditLogConfigs:
  - logType: ADMIN_READ
  - logType: DATA_WRITE
  - logType: DATA_READ
  service: allServices
EOF

For more configuration options, see the AuditConfig documentation.

In Cloud Shell, apply the new IAM policy to your Cloud project:
gcloud projects set-iam-policy ${PROJECT_ID} /tmp/policy.yaml
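To confirm that the Data Access audit logs configuration was applied, you can optionally read back just the auditConfigs section of the updated policy:

# Optional check: print only the auditConfigs section of the project policy.
gcloud projects get-iam-policy ${PROJECT_ID} --format="yaml(auditConfigs)"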
Configure the logging export
In this section, you create a sink that exports the Cloud Audit Logs, Google Kubernetes Engine (GKE) logs, Cloud Storage logs, and BigQuery logs to the Pub/Sub topic that you previously created.
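If you want to preview which log entries the export will match, you can optionally run the same filter that the next step uses through gcloud logging read; the --limit flag keeps the output short:

# Optional preview: list a few entries that match the export filter.
gcloud logging read \
  'logName:"/logs/cloudaudit.googleapis.com" OR resource.type:gke_cluster OR resource.type=gcs_bucket OR resource.type=bigquery_resource' \
  --limit=5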
In Cloud Shell, create a log sink with a custom logging filter that matches the Cloud Audit Logs, GKE logs, Cloud Storage logs, and BigQuery logs:
gcloud logging sinks create export-logs-datadog \
  pubsub.googleapis.com/projects/${PROJECT_ID}/topics/datadog-logs-export-topic \
  --log-filter='logName:"/logs/cloudaudit.googleapis.com" OR resource.type:gke_cluster OR resource.type=gcs_bucket OR resource.type=bigquery_resource'
The output is similar to the following:
Created [https://logging.googleapis.com/v2/projects/PROJECT_ID/sinks/export-logs-datadog].
Please remember to grant `serviceAccount:LOGS_SINK_SERVICE_ACCOUNT` the Pub/Sub Publisher role on the topic.
More information about sinks can be found at /logging/docs/export/configure_export
Make a note of the LOGS_SINK_SERVICE_ACCOUNT service account name from the preceding command output.

Grant the LOGS_SINK_SERVICE_ACCOUNT service account permission to publish to the datadog-logs-export-topic topic:

gcloud pubsub topics add-iam-policy-binding datadog-logs-export-topic \
  --member serviceAccount:LOGS_SINK_SERVICE_ACCOUNT \
  --role roles/pubsub.publisher
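To verify the binding, you can optionally read back the IAM policy of the topic and check that LOGS_SINK_SERVICE_ACCOUNT is listed with the roles/pubsub.publisher role:

# Optional check: list the IAM bindings on the export topic.
gcloud pubsub topics get-iam-policy datadog-logs-export-topic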
Now that the Google Cloud service account has the necessary permissions, the logs are shipped to Datadog. You can view these logs in the Datadog Log Explorer. If you're an EU user, view the logs at the EU link.
Stay within quotas and limits
Pub/Sub is subject to quotas and resource limits. If your log volume is at risk of exceeding the limit for a single subscription, Datadog recommends splitting logs across multiple topics by using different filters. In Datadog, you can set up a threshold alert that notifies you automatically when the number of outstanding messages in a subscription (gcp.pubsub.subscription.num_outstanding_messages) exceeds 1000, so that you can make the necessary changes to your configuration. For more information, see Monitoring push subscriptions.
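One way to split the export is to create one topic and sink per log source, each with a narrower filter. The following sketch uses illustrative topic and sink names (datadog-audit-logs-topic, datadog-gke-logs-topic, and so on); each new topic also needs its own push subscription and Pub/Sub Publisher binding, as described earlier:

# Illustrative only: route audit logs and GKE logs to separate topics.
gcloud pubsub topics create datadog-audit-logs-topic
gcloud pubsub topics create datadog-gke-logs-topic

# One sink per topic, each with a narrower filter.
gcloud logging sinks create export-audit-logs-datadog \
  pubsub.googleapis.com/projects/${PROJECT_ID}/topics/datadog-audit-logs-topic \
  --log-filter='logName:"/logs/cloudaudit.googleapis.com"'

gcloud logging sinks create export-gke-logs-datadog \
  pubsub.googleapis.com/projects/${PROJECT_ID}/topics/datadog-gke-logs-topic \
  --log-filter='resource.type:gke_cluster'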
Using the exported logs
Datadog automatically parses useful attributes from your logs, such as cluster name and project ID. You can use these attributes to drill down to subsets of logs, identify common patterns in your logs, and troubleshoot issues. You can also add filtered log streams and log analytics graphs to your dashboards. These elements help you to visualize and correlate log data with metrics and distributed traces from the rest of your Google Cloud environment.
Cleaning up
The easiest way to eliminate billing is to delete the Cloud project that you created for this scenario. Alternatively, you can delete the individual resources.

Delete the project
- In the Cloud Console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.
Delete the individual resources
After you complete this scenario, follow these steps to disable the Datadog integration with Google Cloud and revert any changes to your Cloud Logging configuration.
In your Datadog account, go to the Google Cloud integration tile and select Uninstall Integration. If you're an EU user, use the EU page. If you're a non-EU user, use the default US page.
In Cloud Shell, save your project's IAM policy as a file:
gcloud projects get-iam-policy ${PROJECT_ID} | tee /tmp/policy.yaml
Edit the /tmp/policy.yaml file to empty the auditConfigs section. Do not remove this section entirely; leave it empty. If you remove it completely, the Resource Manager setIamPolicy method can't apply any changes to your Data Access audit logs configuration:

auditConfigs:
Apply the new IAM policy to your project:
gcloud projects set-iam-policy ${PROJECT_ID} /tmp/policy.yaml
Delete the Cloud Logging sink:
gcloud logging sinks delete export-logs-datadog
Delete the Pub/Sub topic and subscription:
gcloud pubsub topics delete datadog-logs-export-topic
gcloud pubsub subscriptions delete datadog-logs-export-subscription
Delete the service account that you created for the Datadog integration with Google Cloud:
- In the Cloud Console, go to the Service Accounts page.
- Select the service account that you created for the Datadog integration with Google Cloud.
- Click Delete, and then click Delete again to confirm.
What's next
- Learn how to get started with Datadog Log Management.
- Learn how to configure Datadog to monitor your Anthos-managed infrastructure.
- See the other export scenarios in the Design patterns for exporting Cloud Logging series.
- Try out other Google Cloud features for yourself. Have a look at our tutorials.