Collect Aware audit logs

This document explains how to ingest Aware audit logs to Google Security Operations using Amazon S3.

Before you begin

Make sure you have the following prerequisites:

  • Google SecOps instance
  • Privileged access to Aware tenant
  • Privileged access to AWS (S3, IAM, Lambda, EventBridge)

Collect Aware prerequisites (IDs, API keys, org IDs, tokens)

  1. Sign in to the Aware Admin Console.
  2. Go to System Settings > Integrations > API Tokens.
  3. Click + API Token and grant Audit Logs Read-only permission.
  4. Copy the following details and save them in a secure location (you can verify the token with the sketch after this list):
    • API Token
    • API Base URL: https://api.aware.work/external/system/auditlogs/v1
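
To confirm the token works before you build the AWS side, you can request a single page from the endpoint. This is a minimal sketch, assuming the token is sent in the X-Aware-Api-Key header with limit and offset query parameters, matching the Lambda code later in this document:

    import json
    import urllib.parse
    import urllib.request

    API_TOKEN = "<your-aware-api-token>"  # placeholder: the token you saved above
    BASE_URL = "https://api.aware.work/external/system/auditlogs/v1"

    # Request a single record to prove the token authenticates.
    query = urllib.parse.urlencode({"limit": "1", "offset": "1"})
    req = urllib.request.Request(f"{BASE_URL}?{query}")
    req.add_header("X-Aware-Api-Key", API_TOKEN)
    with urllib.request.urlopen(req, timeout=30) as resp:
        print(resp.status)  # expect 200 for a valid token
        print(resp.read().decode("utf-8")[:500])  # first part of the response body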

Configure AWS S3 bucket and IAM for Google SecOps

  1. Create Amazon S3 bucket following this user guide: Creating a bucket
  2. Save bucket Name and Region for future reference (for example, aware-audit-logs).
  3. Create a user following this user guide: Creating an IAM user.
  4. Select the created User.
  5. Select the Security credentials tab.
  6. Click Create Access Key in the Access Keys section.
  7. Select Third-party service as the Use case.
  8. Click Next.
  9. Optional: add a description tag.
  10. Click Create access key.
  11. Click Download CSV file to save the Access Key and Secret Access Key for later use (you can sanity-check them with the sketch after this list).
  12. Click Done.
  13. Select the Permissions tab.
  14. Click Add permissions in the Permissions policies section.
  15. Select Add permissions.
  16. Select Attach policies directly.
  17. Search for and select the AmazonS3FullAccess policy.
  18. Click Next.
  19. Click Add permissions.
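
Optionally, verify that the new key pair can write to the bucket before you continue. A minimal boto3 sketch, assuming the bucket name from step 2; the placeholders come from the CSV file you downloaded:

    import boto3

    # Credentials from the downloaded CSV file.
    s3 = boto3.client(
        "s3",
        aws_access_key_id="<ACCESS_KEY_ID>",
        aws_secret_access_key="<SECRET_ACCESS_KEY>",
        region_name="<REGION>",
    )

    # Round-trip a small test object, then clean it up.
    s3.put_object(Bucket="aware-audit-logs", Key="aware/connectivity-test.txt", Body=b"ok")
    print(s3.get_object(Bucket="aware-audit-logs", Key="aware/connectivity-test.txt")["Body"].read())
    s3.delete_object(Bucket="aware-audit-logs", Key="aware/connectivity-test.txt")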

Configure the IAM policy and role for S3 uploads

  1. In the AWS console, go to IAM > Policies > Create policy > JSON tab.
  2. Enter the following policy:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "AllowPutObjects",
          "Effect": "Allow",
          "Action": "s3:PutObject",
          "Resource": "arn:aws:s3:::aware-audit-logs/*"
        },
        {
          "Sid": "AllowGetStateObject",
          "Effect": "Allow",
          "Action": "s3:GetObject",
          "Resource": "arn:aws:s3:::aware-audit-logs/aware/state.json"
        }
      ]
    }
    
    • Replace aware-audit-logs if you entered a different bucket name.
  3. Click Next > Create policy.

  4. Go to IAM > Roles > Create role > AWS service > Lambda.

  5. Attach the newly created policy.

  6. Name the role AwareAuditLambdaRole and click Create role.
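
If you prefer to script these steps, the same policy and role can be created with boto3. A sketch, assuming the policy JSON from step 2 is saved locally as aware_policy.json; the policy name AwareAuditS3Write is illustrative:

    import json

    import boto3

    iam = boto3.client("iam")

    # Trust policy that lets the Lambda service assume the role.
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }

    with open("aware_policy.json") as f:  # the S3 policy document from step 2
        policy_document = f.read()

    policy = iam.create_policy(PolicyName="AwareAuditS3Write", PolicyDocument=policy_document)
    iam.create_role(RoleName="AwareAuditLambdaRole",
                    AssumeRolePolicyDocument=json.dumps(trust_policy))
    iam.attach_role_policy(RoleName="AwareAuditLambdaRole",
                           PolicyArn=policy["Policy"]["Arn"])

You may also want to attach the AWS managed AWSLambdaBasicExecutionRole policy to the role so the function can write its own CloudWatch logs.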

Create the Lambda function

  1. In the AWS Console, go to Lambda > Functions > Create function.
  2. Click Author from scratch.
  3. Provide the following configuration details:
    • Name: aware-audit-poller
    • Runtime: Python 3.13
    • Architecture: x86_64
    • Execution role: AwareAuditLambdaRole
  4. After the function is created, open the Code tab, delete the stub code, and enter the following code (aware-audit-poller.py):

    import boto3, gzip, io, json, os, time, urllib.parse
    import urllib.request
    from datetime import datetime, timedelta, timezone
    from botocore.exceptions import ClientError
    
    AWARE_ENDPOINT = "https://api.aware.work/external/system/auditlogs/v1"
    API_TOKEN = os.environ["AWARE_API_TOKEN"]
    BUCKET = os.environ["S3_BUCKET"]
    PREFIX = os.environ.get("S3_PREFIX", "aware/audit/")
    STATE_KEY = os.environ.get("STATE_KEY", "aware/state.json")
    MAX_PER_PAGE = int(os.environ.get("MAX_PER_PAGE", "500"))
    
    s3 = boto3.client("s3")
    
    # Read the saved ingestion cursor (last processed day) from S3; a missing
    # state object means this is the first run.
    def _load_state():
        try:
            obj = s3.get_object(Bucket=BUCKET, Key=STATE_KEY)
            return json.loads(obj["Body"].read().decode("utf-8"))
        except ClientError as e:
            if e.response.get("Error", {}).get("Code") == "NoSuchKey":
                return {}
            raise
    
    def _save_state(state):
        s3.put_object(Bucket=BUCKET, Key=STATE_KEY, Body=json.dumps(state).encode("utf-8"))
    
    # Lambda entry point: walk day by day from the saved cursor (or yesterday on
    # the first run) through today, page through the Aware audit log API, and
    # upload each day's events to S3 as gzipped JSONL.
    def handler(event, context):
        tz_utc = timezone.utc
        now = datetime.now(tz=tz_utc)
    
        state = _load_state()
        start_date = (
            datetime.fromisoformat(state["last_date"]).date() if "last_date" in state
            else (now - timedelta(days=1)).date()
        )
        end_date = now.date()
    
        total = 0
        day = start_date
        while day <= end_date:
            day_str = day.strftime("%Y-%m-%d")
            params = {"filter": f"startDate:{day_str},endDate:{day_str}", "limit": str(MAX_PER_PAGE)}
            offset = 1
    
            out = io.BytesIO()
            gz = gzip.GzipFile(filename="aware_audit.jsonl", mode="wb", fileobj=out)
            wrote_any = False
    
            while True:
                q = urllib.parse.urlencode({**params, "offset": str(offset)})
                req = urllib.request.Request(f"{AWARE_ENDPOINT}?{q}")
                req.add_header("X-Aware-Api-Key", API_TOKEN)
                with urllib.request.urlopen(req, timeout=30) as resp:
                    payload = json.loads(resp.read().decode("utf-8"))
                items = (payload.get("value") or {}).get("auditLogData") or []
                if not items:
                    break
                for item in items:
                    # Write one JSON object per line (JSONL); "\n" terminates each record.
                    gz.write((json.dumps(item, separators=(",", ":")) + "\n").encode("utf-8"))
                    total += 1
                    wrote_any = True
                offset += 1
                time.sleep(0.2)
    
            gz.close()
            if wrote_any:
                key = f"{PREFIX}{day.strftime('%Y/%m/%d')}/aware_audit_{now.strftime('%Y%m%d_%H%M%S')}.jsonl.gz"
                s3.put_object(
                    Bucket=BUCKET,
                    Key=key,
                    Body=out.getvalue(),
                    ContentType="application/json",
                    ContentEncoding="gzip",
                )
    
            _save_state({"last_date": day.isoformat()})
            day += timedelta(days=1)
    
        return {"status": "ok", "written": total}
    
  5. Go to Configuration > Environment variables > Edit > Add new environment variable.

  6. Enter the following environment variables, replacing the example values with your own:

    • S3_BUCKET: aware-audit-logs
    • S3_PREFIX: aware/audit/
    • STATE_KEY: aware/state.json
    • AWARE_API_TOKEN: <your-aware-api-token>
    • MAX_PER_PAGE: 500
  7. After the function is created, stay on its page (or open Lambda > Functions > aware-audit-poller).

  8. Select the Configuration tab.

  9. In the General configuration panel, click Edit.

  10. Change Timeout to 5 minutes (300 seconds) and click Save.
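
Steps 5 through 10 can also be applied in a single API call. A boto3 sketch, assuming the function name from the configuration above; replace the token placeholder with your own value:

    import boto3

    lambda_client = boto3.client("lambda")

    # Set the environment variables and the 5-minute timeout in one call.
    lambda_client.update_function_configuration(
        FunctionName="aware-audit-poller",
        Timeout=300,
        Environment={"Variables": {
            "S3_BUCKET": "aware-audit-logs",
            "S3_PREFIX": "aware/audit/",
            "STATE_KEY": "aware/state.json",
            "AWARE_API_TOKEN": "<your-aware-api-token>",
            "MAX_PER_PAGE": "500",
        }},
    )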

Create an EventBridge schedule

  1. Go to Amazon EventBridge > Scheduler > Create schedule.
  2. Provide the following configuration details:
    • Recurring schedule: Rate (1 hour).
    • Target: Your Lambda function aware-audit-poller.
    • Name: aware-audit-poller-1h.
  3. Click Create schedule.
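
The same schedule can be created through the EventBridge Scheduler API. A boto3 sketch; the ARNs are placeholders for your account, and the role is an assumption (Scheduler must assume a role that is allowed to call lambda:InvokeFunction on the target function):

    import boto3

    scheduler = boto3.client("scheduler")

    scheduler.create_schedule(
        Name="aware-audit-poller-1h",
        ScheduleExpression="rate(1 hour)",
        FlexibleTimeWindow={"Mode": "OFF"},
        Target={
            # Placeholder ARNs: fill in your region and account ID.
            "Arn": "arn:aws:lambda:<region>:<account-id>:function:aware-audit-poller",
            # Role EventBridge Scheduler assumes to invoke the function.
            "RoleArn": "arn:aws:iam::<account-id>:role/<scheduler-invoke-role>",
        },
    )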

Optional: Create read-only IAM user & keys for Google SecOps

  1. In the AWS Console, go to IAM > Users.
  2. Click Add users.
  3. Provide the following configuration details:
    • User: secops-reader.
    • Access type: Access key — Programmatic access.
  4. Click Create user.
  5. Attach minimal read policy (custom): Users > secops-reader > Permissions > Add permissions > Attach policies directly > Create policy.
  6. In the JSON editor, enter the following policy:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": ["s3:GetObject"],
          "Resource": "arn:aws:s3:::aware-audit-logs/*"
        },
        {
          "Effect": "Allow",
          "Action": ["s3:ListBucket"],
          "Resource": "arn:aws:s3:::aware-audit-logs"
        }
      ]
    }
    
  7. Set the name to secops-reader-policy.

  8. Back on the Add permissions screen, search for and select secops-reader-policy, then click Next > Add permissions.

  9. Go to Security credentials > Access keys > Create access key.

  10. Download the CSV (these values are entered into the feed).
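
Before configuring the feed, you can confirm that the reader keys can list and fetch objects in the bucket. A minimal boto3 sketch using the values from the CSV:

    import boto3

    s3 = boto3.client(
        "s3",
        aws_access_key_id="<SECOPS_READER_ACCESS_KEY_ID>",
        aws_secret_access_key="<SECOPS_READER_SECRET_ACCESS_KEY>",
    )

    # Listing is allowed by s3:ListBucket; writes should fail with AccessDenied.
    resp = s3.list_objects_v2(Bucket="aware-audit-logs", Prefix="aware/audit/", MaxKeys=5)
    for obj in resp.get("Contents", []):
        print(obj["Key"])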

Configure a feed in Google SecOps to ingest Aware Audit logs

  1. Go to SIEM Settings > Feeds.
  2. Click + Add New Feed.
  3. In the Feed name field, enter a name for the feed (for example, Aware Audit logs).
  4. Select Amazon S3 V2 as the Source type.
  5. Select Aware Audit as the Log type.
  6. Click Next.
  7. Specify values for the following input parameters:
    • S3 URI: s3://aware-audit-logs/aware/audit/
    • Source deletion options: Select deletion option according to your preference.
    • Maximum File Age: Include files modified in the last number of days. Default is 180 days.
    • Access Key ID: User access key with access to the S3 bucket.
    • Secret Access Key: User secret key with access to the S3 bucket.
    • Asset namespace: The asset namespace.
    • Ingestion labels: The label applied to the events from this feed.
  8. Click Next.
  9. Review your new feed configuration in the Finalize screen, and then click Submit.

Need more help? Get answers from Community members and Google SecOps professionals.