Collect Akamai DNS logs

This parser processes Akamai DNS logs. It extracts fields such as timestamps, source IP and port, query, DNS record type, and response details, then maps them to the UDM, handling various DNS record types and potential SPF records. The parser classifies each event as either NETWORK_DNS or GENERIC_EVENT based on the presence of principal information.

Before you begin

  • Ensure that you have a Google SecOps instance.
  • Ensure that you have privileged access to AWS IAM and S3.
  • Ensure your Akamai account has access to the Log Delivery Service.

Configure an Amazon S3 bucket

  1. Create an Amazon S3 bucket following this user guide: Creating a bucket.
  2. Save the bucket Name and Region for future reference.
  3. Create a User following this user guide: Creating an IAM user.
  4. Select the created User.
  5. Select the Security credentials tab.
  6. Click Create Access Key in the Access Keys section.
  7. Select Third-party service as the Use case.
  8. Click Next.
  9. Optional: Add a description tag.
  10. Click Create access key.
  11. Click Download .csv file and save the Access Key and Secret Access Key for future reference.
  12. Click Done.
  13. Select the Permissions tab.
  14. Click Add permissions in the Permissions policies section.
  15. Select Add permissions from the menu.
  16. Select Attach policies directly.
  17. Search for and select the AmazonS3FullAccess policy.
  18. Click Next.
  19. Click Add permissions.
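
The AmazonS3FullAccess policy grants broad access to every bucket in the account. If your security policy requires least privilege, a scoped inline policy covering only the delivery bucket is sufficient. The following is a sketch, assuming a hypothetical bucket named akamai-dns-logs (replace with your bucket name):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AkamaiDnsLogDelivery",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::akamai-dns-logs",
        "arn:aws:s3:::akamai-dns-logs/*"
      ]
    }
  ]
}
```

Attach this policy instead of AmazonS3FullAccess in step 17 if you prefer the narrower grant.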

Configure Log Delivery Service in Akamai

  1. Sign in to the Akamai Control Center.
  2. Go to Log Delivery Service under Data Services.
  3. Click Add New Configuration.
  4. In the Configuration Name field, provide a name for your configuration (for example, Edge DNS Logs to S3).
  5. Select Edge DNS as the Log Source.
  6. Select AWS S3 as the Delivery Target.
  7. Provide the following details:
    • Bucket Name: the name of your S3 bucket.
    • Region: the AWS region where your bucket is hosted.
    • Access Key ID: the IAM user Access Key ID.
    • Secret Access Key: the IAM user Secret Access Key.
    • Optional: specify the Directory Structure (for example, logs/akamai-dns/YYYY/MM/DD/HH/).
    • Optional: set the File Naming Convention (for example, edge-dns-logs-{timestamp}.log).
  8. Select the Log Formats you want to include:
    • DNS Queries
    • DNS Responses
  9. Choose the Delivery Frequency:
    • Options include hourly, daily, or upon reaching a specified file size (for example, 100 MB).
  10. Optional: Click Add Filters to include or exclude logs based on criteria such as hostname or record type.
  11. Review the configuration details and click Save and Activate.

Configure a feed in Google SecOps to ingest Akamai DNS logs

  1. Go to SIEM Settings > Feeds.
  2. Click Add new.
  3. In the Feed name field, enter a name for the feed (for example, Akamai DNS Logs).
  4. Select Amazon S3 as the Source type.
  5. Select Akamai DNS as the Log type.
  6. Click Next.
  7. Specify values for the following input parameters:

    • Region: the region where the Amazon S3 bucket is located.
    • S3 URI: the bucket URI.

      • s3://BUCKET_NAME

      Replace the following:

      • BUCKET_NAME: the name of the bucket.
    • URI is a: select the URI type according to the log stream configuration (Single file | Directory | Directory which includes subdirectories).

    • Source deletion option: select the deletion option according to your preference.

    • Access Key ID: the user access key with access to the S3 bucket.

    • Secret Access Key: the user secret key with access to the S3 bucket.

    • Asset namespace: the asset namespace.

    • Ingestion labels: the label to be applied to the events from this feed.

  8. Click Next.

  9. Review your new feed configuration in the Finalize screen, and then click Submit.
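
A malformed S3 URI (missing s3:// scheme, trailing whitespace, or a pasted HTTPS object URL) is a common cause of feed errors. The following hypothetical helper, not part of Google SecOps, sketches a sanity check you can run on the value before submitting the feed:

```python
import re


def validate_s3_uri(uri: str) -> str:
    """Return a normalized s3:// URI, or raise ValueError if malformed.

    Accepts the s3://BUCKET_NAME and s3://BUCKET_NAME/prefix/ forms.
    """
    uri = uri.strip()
    # Bucket names: 3-63 chars, lowercase letters, digits, dots, hyphens,
    # starting and ending with a letter or digit; an optional key prefix follows.
    match = re.fullmatch(r"s3://([a-z0-9][a-z0-9.-]{1,61}[a-z0-9])(/.*)?", uri)
    if not match:
        raise ValueError(f"not a valid S3 URI: {uri!r}")
    return uri
```

For example, `validate_s3_uri("s3://my-akamai-logs/logs/akamai-dns/")` returns the URI unchanged, while an HTTPS object URL raises ValueError.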

UDM Mapping Table

| Log Field | UDM Mapping | Logic |
| --- | --- | --- |
| class | read_only_udm.network.dns.questions.class | If class is "IN", set to 1. Otherwise, attempt conversion to an unsigned integer. |
| column11 | read_only_udm.target.hostname | Mapped if it contains a hostname and doesn't contain specific patterns like "ip4", "=", ".net", or "10 mx0". Also used for extracting IP addresses, email addresses, and DNS authority data based on various patterns. |
| column11 | read_only_udm.target.ip | Extracted from column11 if it matches the pattern for IP addresses within SPF records. |
| column11 | read_only_udm.target.user.email_addresses | Extracted from column11 if it matches the pattern for email addresses within DMARC records. |
| column11 | read_only_udm.network.dns.authority.data | Extracted from column11 if it matches patterns for domain names within various record types. |
| column11 | read_only_udm.network.dns.response_code | Set to 3 if column11 contains "NXDOMAIN". |
| column2 | read_only_udm.principal.ip | Mapped if it is a valid IP address. |
| column3 | read_only_udm.principal.port | Mapped if it is a valid integer. |
| column4 | read_only_udm.network.dns.questions.name | Directly mapped. |
| column6 | read_only_udm.network.dns.questions.type | Mapped based on the value of type, using conditional logic to assign the corresponding numerical value. |
| column8 | read_only_udm.network.sent_bytes | Converted to an unsigned integer and mapped. |
|  | read_only_udm.metadata.event_timestamp | Constructed from the date and time fields extracted from column1. |
|  | read_only_udm.event_type | Set to NETWORK_DNS if principal.ip is present, otherwise set to GENERIC_EVENT. |
|  | read_only_udm.product_name | Hardcoded to AKAMAI_DNS. |
|  | read_only_udm.vendor_name | Hardcoded to AKAMAI_DNS. |
|  | read_only_udm.dataset | Hardcoded to AKAMAI_DNS. |
|  | read_only_udm.event_subtype | Hardcoded to DNS. |
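
The per-row conditions above can be sketched in a few lines. This is an illustration of the documented logic (a valid column2 IP becomes principal.ip and drives the NETWORK_DNS classification; NXDOMAIN in column11 sets response code 3), not the parser's actual implementation, and the column names are taken from the table:

```python
import ipaddress


def classify_event(columns: dict) -> dict:
    """Illustrative mapping of a pre-split Akamai DNS log line to UDM-style fields.

    `columns` uses the same column2/column3/column11 names as the mapping table.
    """
    event = {"product_name": "AKAMAI_DNS", "vendor_name": "AKAMAI_DNS"}

    # column2 -> principal.ip, only if it is a valid IP address.
    try:
        event["principal_ip"] = str(ipaddress.ip_address(columns.get("column2", "")))
    except ValueError:
        pass

    # column3 -> principal.port, only if it is a valid integer.
    port = columns.get("column3", "")
    if port.isdigit():
        event["principal_port"] = int(port)

    # column11 containing NXDOMAIN -> DNS response code 3.
    if "NXDOMAIN" in columns.get("column11", ""):
        event["dns_response_code"] = 3

    # Event type depends on whether principal information was extracted.
    event["event_type"] = "NETWORK_DNS" if "principal_ip" in event else "GENERIC_EVENT"
    return event
```

A line with an unparsable source IP therefore falls back to GENERIC_EVENT, which matches the behavior described in the parser overview.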

Changes

2024-05-28

  • Bug fix: Added a gsub function to remove double quotes from the log message.
  • Added Grok patterns to validate IP address and port values before mapping.