Collect Snipe-IT logs

This document explains how to ingest Snipe-IT logs to Google Security Operations using Amazon S3.

Before you begin

Make sure you have the following prerequisites:

  • A Google SecOps instance.
  • Privileged access to the Snipe-IT tenant.
  • Privileged access to AWS (S3, Identity and Access Management (IAM), Lambda, EventBridge).

Collect Snipe-IT prerequisites (API token and base URL)

  1. Sign in to Snipe-IT.
  2. Open your user menu (top-right avatar) and click Manage API keys.
  3. Click Create New API Key:
    • Name/Label: Enter a descriptive label (for example, Google SecOps export).
    • Click Generate.
  4. Copy the API token (it will be shown only once). Store it securely.
  5. Determine your API base URL, typically:
    • https://<your-domain>/api/v1
    • Example: https://snipeit.example.com/api/v1
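
The token is sent as a Bearer header on every request. As a quick sanity check before wiring up AWS, you can assemble a request against the hardware endpoint; a minimal sketch using only the standard library (the domain and token are the placeholder values from above):

```python
import urllib.parse
from urllib.request import Request

def build_request(base_url: str, token: str, limit: int = 1) -> Request:
    """Build an authenticated GET request for the Snipe-IT hardware endpoint."""
    qs = urllib.parse.urlencode({"limit": limit})
    return Request(
        f"{base_url.rstrip('/')}/hardware?{qs}",
        headers={"Authorization": f"Bearer {token}", "Accept": "application/json"},
    )

req = build_request("https://snipeit.example.com/api/v1", "<your-api-token>")
print(req.full_url)  # https://snipeit.example.com/api/v1/hardware?limit=1
```

Passing the request to urllib.request.urlopen should return a JSON body (including a total count) if the token and base URL are valid.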

Configure AWS S3 bucket and IAM for Google SecOps

  1. Create an Amazon S3 bucket following this user guide: Creating a bucket
  2. Save bucket Name and Region for future reference (for example, snipe-it-logs).
  3. Create a User following this user guide: Creating an IAM user.
  4. Select the created User.
  5. Select Security credentials tab.
  6. Click Create Access Key in section Access Keys.
  7. Select Third-party service as Use case.
  8. Click Next.
  9. Optional: Add a description tag.
  10. Click Create access key.
  11. Click Download CSV file to save the Access Key and Secret Access Key for future reference.
  12. Click Done.
  13. Select Permissions tab.
  14. In the Permissions policies section, click Add permissions.
  15. Select Add permissions from the drop-down menu.
  16. Select Attach policies directly.
  17. Search for AmazonS3FullAccess policy.
  18. Select the policy.
  19. Click Next.
  20. Click Add permissions.

Configure the IAM policy and role for S3 uploads

  1. In the AWS console, go to IAM > Policies.
  2. Click Create policy > JSON tab.
  3. Copy and paste the following policy.
  4. Policy JSON (replace snipe-it-logs if you entered a different bucket name):

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "AllowPutObjects",
          "Effect": "Allow",
          "Action": "s3:PutObject",
          "Resource": "arn:aws:s3:::snipe-it-logs/*"
        },
        {
          "Sid": "AllowGetStateObject",
          "Effect": "Allow",
          "Action": "s3:GetObject",
          "Resource": "arn:aws:s3:::snipe-it-logs/snipeit/state.json"
        }
      ]
    }
    
  5. Click Next > Create policy.

  6. Go to IAM > Roles > Create role > AWS service > Lambda.

  7. Attach the newly created policy.

  8. Name the role SnipeITToS3Role and click Create role.
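
If you provision this with a script rather than the console, the policy JSON above can be rendered for any bucket name; a minimal sketch (the bucket name is the example used throughout this guide):

```python
import json

def upload_policy(bucket: str) -> str:
    """Render the Lambda upload policy shown above for a given bucket name."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowPutObjects",
                "Effect": "Allow",
                "Action": "s3:PutObject",
                "Resource": f"arn:aws:s3:::{bucket}/*",
            },
            {
                "Sid": "AllowGetStateObject",
                "Effect": "Allow",
                "Action": "s3:GetObject",
                "Resource": f"arn:aws:s3:::{bucket}/snipeit/state.json",
            },
        ],
    }, indent=2)

print(upload_policy("snipe-it-logs"))
```

The rendered document can then be passed to, for example, the aws iam create-policy CLI command or boto3's iam.create_policy.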

Create the Lambda function

  1. In the AWS Console, go to Lambda > Functions > Create function.
  2. Click Author from scratch.
  3. Provide the following configuration details:

    • Name: snipeit_assets_to_s3
    • Runtime: Python 3.13
    • Architecture: x86_64
    • Execution role: SnipeITToS3Role
  4. After the function is created, open the Code tab, delete the stub and paste the following code (snipeit_assets_to_s3.py).

    #!/usr/bin/env python3
    # Lambda: Pull Snipe-IT hardware (assets) via REST API and write raw JSON pages to S3 (no transform)
    
    import os, json, time, urllib.parse
    from urllib.request import Request, urlopen
    import boto3
    
    BASE = os.environ["SNIPE_BASE_URL"].rstrip("/")  # e.g. https://snipeit.example.com/api/v1
    TOKEN = os.environ["SNIPE_API_TOKEN"]
    BUCKET = os.environ["S3_BUCKET"]
    PREFIX = os.environ.get("S3_PREFIX", "snipeit/assets/").strip("/")  # strip slashes to avoid "//" in object keys
    PAGE_SIZE = int(os.environ.get("PAGE_SIZE", "500"))  # Snipe-IT max 500 per request
    MAX_PAGES = int(os.environ.get("MAX_PAGES", "200"))
    
    s3 = boto3.client("s3")
    
    def _headers():
        return {"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"}
    
    def fetch_page(offset: int) -> dict:
        params = {"limit": PAGE_SIZE, "offset": offset, "sort": "id", "order": "asc"}
        qs = urllib.parse.urlencode(params)
        url = f"{BASE}/hardware?{qs}"
        req = Request(url, method="GET", headers=_headers())
        with urlopen(req, timeout=60) as r:
            return json.loads(r.read().decode("utf-8"))
    
    def write_page(payload: dict, ts: float, page: int) -> str:
        key = f"{PREFIX}/{time.strftime('%Y/%m/%d', time.gmtime(ts))}/snipeit-hardware-{page:05d}.json"
        body = json.dumps(payload, separators=(",", ":")).encode("utf-8")
        s3.put_object(Bucket=BUCKET, Key=key, Body=body, ContentType="application/json")
        return key
    
    def lambda_handler(event=None, context=None):
        ts = time.time()
        offset = 0
        pages = 0
        total = 0
    
        while pages < MAX_PAGES:
            data = fetch_page(offset)
            rows = data.get("rows") or data.get("data") or []
            write_page(data, ts, pages)
            total += len(rows)
            pages += 1
            if len(rows) < PAGE_SIZE:
                break
            offset += PAGE_SIZE
    
        return {"ok": True, "pages": pages, "records": total}
    
    if __name__ == "__main__":
        print(lambda_handler())
    
  5. Go to Configuration > Environment variables.

  6. Click Edit > Add new environment variable.

  7. Enter the following environment variables, replacing the example values with your own.

    Environment variables

    • S3_BUCKET: snipe-it-logs
    • S3_PREFIX: snipeit/assets/
    • SNIPE_BASE_URL: https://snipeit.example.com/api/v1
    • SNIPE_API_TOKEN: <your-api-token>
    • PAGE_SIZE: 500
    • MAX_PAGES: 200
  8. Stay on the function's page (or open Lambda > Functions > your-function).

  9. Select the Configuration tab.

  10. In the General configuration panel, click Edit.

  11. Change Timeout to 5 minutes (300 seconds) and click Save.
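
Each run writes one JSON object per API page under a date-partitioned key. A sketch of the intended key layout (the helper normalizes slashes; the prefix is the example value, and the timestamp is fixed for illustration):

```python
import time

def object_key(prefix: str, ts: float, page: int) -> str:
    """Date-partitioned S3 key for one page of Snipe-IT hardware data."""
    day = time.strftime("%Y/%m/%d", time.gmtime(ts))
    return f"{prefix.strip('/')}/{day}/snipeit-hardware-{page:05d}.json"

print(object_key("snipeit/assets/", 0, 0))
# snipeit/assets/1970/01/01/snipeit-hardware-00000.json
```

All of these keys fall under s3://snipe-it-logs/snipeit/assets/, which is the S3 URI the feed configuration points at.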

Create an EventBridge schedule

  1. Go to Amazon EventBridge > Scheduler > Create schedule.
  2. Provide the following configuration details:
    • Recurring schedule: Rate (1 hour).
    • Target: Your Lambda function snipeit_assets_to_s3.
    • Name: snipeit_assets_to_s3-1h.
  3. Click Create schedule.
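
The same schedule can be created through the EventBridge Scheduler API. A sketch that builds the CreateSchedule parameters (the function ARN and the invocation role ARN are placeholders you must supply; the invocation role is a separate role that lets Scheduler invoke the Lambda, not SnipeITToS3Role):

```python
def schedule_params(function_arn: str, invoke_role_arn: str) -> dict:
    """Parameters for EventBridge Scheduler's CreateSchedule call."""
    return {
        "Name": "snipeit_assets_to_s3-1h",
        "ScheduleExpression": "rate(1 hour)",
        "FlexibleTimeWindow": {"Mode": "OFF"},
        "Target": {"Arn": function_arn, "RoleArn": invoke_role_arn},
    }

params = schedule_params(
    "arn:aws:lambda:us-east-1:123456789012:function:snipeit_assets_to_s3",
    "arn:aws:iam::123456789012:role/scheduler-invoke-role",
)
print(params["ScheduleExpression"])  # rate(1 hour)
```

Passing these to boto3.client("scheduler").create_schedule(**params) creates the recurring schedule.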

(Optional) Create read-only IAM user and keys for Google SecOps

  1. Go to AWS Console > IAM > Users.
  2. Click Add users.
  3. Provide the following configuration details:
    • User: Enter secops-reader.
    • Access type: Select Access key - Programmatic access.
  4. Click Create user.
  5. Attach minimal read policy (custom): Users > secops-reader > Permissions > Add permissions > Attach policies directly > Create policy.
  6. JSON:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": ["s3:GetObject"],
          "Resource": "arn:aws:s3:::snipe-it-logs/*"
        },
        {
          "Effect": "Allow",
          "Action": ["s3:ListBucket"],
          "Resource": "arn:aws:s3:::snipe-it-logs"
        }
      ]
    }
    
  7. Name the policy secops-reader-policy.

  8. Click Create policy, then return to the Add permissions flow, search for and select secops-reader-policy, and click Next > Add permissions.

  9. Create access key for secops-reader: Security credentials > Access keys.

  10. Click Create access key.

  11. Download the .csv file; you'll paste the Access Key ID and Secret Access Key into the feed configuration.

Configure a feed in Google SecOps to ingest Snipe-IT logs

  1. Go to SIEM Settings > Feeds.
  2. Click + Add New Feed.
  3. In the Feed name field, enter a name for the feed (for example, Snipe-IT logs).
  4. Select Amazon S3 V2 as the Source type.
  5. Select Snipe-IT as the Log type.
  6. Click Next.
  7. Specify values for the following input parameters:
    • S3 URI: s3://snipe-it-logs/snipeit/assets/
    • Source deletion options: Select deletion option according to your preference.
    • Maximum File Age: Include files modified in the last number of days. Default is 180 days.
    • Access Key ID: User access key with access to the S3 bucket.
    • Secret Access Key: User secret key with access to the S3 bucket.
    • Asset namespace: The asset namespace.
    • Ingestion labels: The label applied to the events from this feed.
  8. Click Next.
  9. Review your new feed configuration in the Finalize screen, and then click Submit.

Need more help? Get answers from Community members and Google SecOps professionals.