Collect Synology logs

Supported in: Google SecOps

Overview

This parser extracts fields from Synology SYSLOG messages using Grok patterns and maps them to the Unified Data Model (UDM). It handles various log formats, identifies user logins and resource access, categorizes events based on message keywords, and enriches the data with vendor and product information.

Before you begin

  • Ensure that you have a Google SecOps instance.
  • Ensure that you have privileged access to Synology DSM.

Configure a feed in Google SecOps to ingest the Synology logs

  1. Go to SIEM Settings > Feeds.
  2. Click Add new.
  3. In the Feed name field, enter a name for the feed (for example, Synology Logs).
  4. Select Webhook as the Source type.
  5. Select Synology as the Log type.
  6. Click Next.
  7. Optional: Specify values for the following input parameters:
    • Split delimiter: the delimiter that is used to separate log lines, such as \n.
    • Asset namespace: the asset namespace.
    • Ingestion labels: the labels applied to the events from this feed.
  8. Click Next.
  9. Review the feed configuration in the Finalize screen, and then click Submit.
  10. Click Generate Secret Key to generate a secret key to authenticate this feed.
  11. Copy and store the secret key. You cannot view this secret key again. If needed, you can generate a new secret key, but doing so makes the previous secret key obsolete.
  12. From the Details tab, copy the feed endpoint URL from the Endpoint Information field. You need to specify this endpoint URL in your client application.
  13. Click Done.

Create an API key for the webhook feed

  1. Go to Google Cloud console > Credentials.

  2. Click Create credentials, and then select API key.

  3. Restrict the API key access to the Google Security Operations API.

Specify the endpoint URL

  1. In your client application, specify the HTTPS endpoint URL provided in the webhook feed.
  2. Enable authentication by specifying the API key and secret key as custom headers in the following format:

    X-goog-api-key = API_KEY
    X-Webhook-Access-Key = SECRET
    

    Recommendation: Specify the API key as a header instead of specifying it in the URL.

  3. If your webhook client doesn't support custom headers, you can specify the API key and secret key using query parameters in the following format (both options are shown in the test sketch after these steps):

    ENDPOINT_URL?key=API_KEY&secret=SECRET
    

    Replace the following:

    • ENDPOINT_URL: the feed endpoint URL.
    • API_KEY: the API key to authenticate to Google Security Operations.
    • SECRET: the secret key that you generated to authenticate the feed.
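
Before you enable the webhook on the Synology device, you can verify the endpoint and credentials with a quick test. The following Python sketch is illustrative only; the sample message text and placeholder values are assumptions, not part of the product. It sends one message using header-based authentication, with the query-parameter fallback shown as a comment:

    # Minimal test sketch for the webhook feed endpoint.
    # ENDPOINT_URL, API_KEY, and SECRET are placeholders for the values
    # you generated in the previous steps.
    import requests

    ENDPOINT_URL = "https://<your-feed-endpoint>"
    API_KEY = "<your-api-key>"
    SECRET = "<your-secret-key>"

    headers = {
        "X-goog-api-key": API_KEY,
        "X-Webhook-Access-Key": SECRET,
    }

    # Sample Synology-style notification text; the exact message shape
    # on your system may differ.
    payload = "Test message from Synology DSM: User [admin] signed in successfully."

    response = requests.post(ENDPOINT_URL, headers=headers, data=payload.encode("utf-8"))
    print(response.status_code, response.text)

    # Fallback if your client can't set custom headers (less secure,
    # because keys can end up in request logs):
    # requests.post(f"{ENDPOINT_URL}?key={API_KEY}&secret={SECRET}", data=payload)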

Create a webhook in Synology for Google SecOps

  1. Sign in to DiskStation Manager (DSM) on your Synology NAS.
  2. Go to Control Panel > Notification > Webhook.
  3. Click Add.
  4. Specify values for the following parameters:

    • Provider: Select Custom.
    • Rule: Select the types of messages that you want to send in the webhook.

  5. Click Next.
  6. Specify values for the following parameters:

    • Provider name: Enter a distinctive name for the webhook (for example, Google SecOps).
    • Subject: Enter a prefix that is added to each notification message.
    • Webhook URL: Enter ENDPOINT_URL.

  7. Select Send notification messages in English.
  8. Click Next.
  9. In the HTTP Method list, select POST.
  10. Add the header X-Webhook-Access-Key with the SECRET value.
  11. Add the header X-goog-api-key with the API_KEY value.
  12. Click Apply to save the webhook.

UDM mapping table

Log field | UDM mapping | Logic
app | target.application | The value of the app field extracted by the grok filter is assigned to target.application.
desc | metadata.description | The value of the desc field extracted by the grok filter is assigned to metadata.description.
desc | target.file.names | If the desc field contains "Closed)", the file path within the parentheses is extracted and assigned to target.file.names. If the desc field contains "accessed shared folder", the folder path within the brackets is extracted and assigned to target.file.names.
host | principal.hostname | The value of the host field extracted by the grok filter from the host_and_ip field is assigned to principal.hostname.
host_and_ip | principal.ip | The host_and_ip field is parsed. If an IP address (ip1) is found, it is assigned to principal.ip. If a second IP address (ip2) is found, it is also added to principal.ip.
intermediary_host | intermediary.hostname | The value of the intermediary_host field extracted by the grok filter is assigned to intermediary.hostname.
N/A | extensions.auth | An empty auth object is created within extensions if the message contains "signed in" or "sign in".
N/A | metadata.event_timestamp | The timestamp from the raw log's collection_time field is used.
N/A | metadata.event_type | If the message contains "signed in" or "sign in", the value is set to USER_LOGIN. If the message contains "accessed shared folder", the value is set to USER_RESOURCE_ACCESS. Otherwise, it defaults to GENERIC_EVENT.
N/A | metadata.vendor_name | The value is statically set to "SYNOLOGY".
N/A | metadata.product_name | The value is statically set to "SYNOLOGY".
N/A | security_result.action | If the message contains "failed to sign", the value is set to BLOCK. If the message contains "success", the value is set to ALLOW.
severity | security_result.severity | The value of the severity field extracted by the grok filter is used to determine security_result.severity. If the value is "INFO", it is mapped to "INFORMATIONAL".
time | metadata.event_timestamp | The time field, extracted by the grok filter, is parsed and converted to a timestamp, which is then assigned to metadata.event_timestamp.
type | metadata.product_event_type | The value of the type field extracted by the grok filter is assigned to metadata.product_event_type.
user | target.administrative_domain | If a domain is extracted from the user field, it is assigned to target.administrative_domain.
user | target.user.userid | The username part of the user field (before the "\" if present) is extracted and assigned to target.user.userid.
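
To make the keyword-driven logic in the preceding table concrete, the following Python sketch applies the same rules to a sample message. It is illustrative only: it approximates the categorization behavior described above and is not the parser's actual Grok configuration, and the sample message text is an assumption.

    # Illustrative sketch of the keyword-based categorization described
    # in the mapping table. Not the parser's actual implementation.
    def categorize(message: str) -> dict:
        result = {
            "metadata.vendor_name": "SYNOLOGY",   # statically set
            "metadata.product_name": "SYNOLOGY",  # statically set
        }
        # Event type selection by message keywords.
        if "signed in" in message or "sign in" in message:
            result["metadata.event_type"] = "USER_LOGIN"
        elif "accessed shared folder" in message:
            result["metadata.event_type"] = "USER_RESOURCE_ACCESS"
        else:
            result["metadata.event_type"] = "GENERIC_EVENT"
        # Security action selection by message keywords.
        if "failed to sign" in message:
            result["security_result.action"] = "BLOCK"
        elif "success" in message:
            result["security_result.action"] = "ALLOW"
        return result

    print(categorize("User [admin] signed in to DSM successfully."))
    # {'metadata.vendor_name': 'SYNOLOGY', 'metadata.product_name': 'SYNOLOGY',
    #  'metadata.event_type': 'USER_LOGIN', 'security_result.action': 'ALLOW'}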

Changes

2024-01-16

  • Newly created parser.