
Installing the Audit Logs app

This page explains how to install the Audit Logs example from the Cloud Security Command Center (Cloud SCC) application package. The Audit Logs app can ingest Cloud Audit Logs entries through export sinks and create Cloud SCC security findings. The app also integrates Access Transparency alerts and Binary Authorization alerts for Blocked Deployment and Breakglass scenarios.

This guide was written for tools version 3.3.0. If you're using a different version, see the README file included with the tools version you downloaded. As of May 22, 2019, the most recent release is version 4.0.1.

The Audit Logs app can create the following types of logs based on the log filters you select:

  • single: creates a Cloud SCC finding for each occurrence in the following categories:
    • Google Kubernetes Engine (GKE) Binary Authorization
    • Compute Engine
    • Cloud Storage
    • Service Networking
  • aggregated: groups entries within a Cloud Dataflow time window and then creates a Cloud Identity and Access Management (Cloud IAM) finding.

The Audit Logs app creates the following types of findings based on the source type:

  • Cloud Audit Logs findings
  • Binary Authorization findings

Before you begin

Before you start this guide, you must complete the prerequisites and installation setup in Setting up Cloud SCC tools.

To install and run the Audit Logs package, you will also need the following:

  • An active GCP Organization
  • An active Cloud Billing account
  • The following Cloud IAM roles at the organization level:
    • Organization Administrator - roles/resourcemanager.organizationAdmin
    • Project Creator - roles/resourcemanager.projectCreator
    • Billing Account User - roles/billing.user
    • Viewer - roles/viewer
    • App Engine Admin - roles/appengine.appAdmin
    • Cloud Functions Developer - roles/cloudfunctions.developer
    • Cloud Scheduler Admin - roles/cloudscheduler.admin
    • Dataflow Admin - roles/dataflow.admin
    • Service Account Admin - roles/iam.serviceAccountAdmin
    • Service Account Key Admin - roles/iam.serviceAccountKeyAdmin
    • Service Account User - roles/iam.serviceAccountUser
    • API Keys Admin - roles/serviceusage.apiKeysAdmin
    • Pub/Sub Admin - roles/pubsub.admin
    • Project IAM Admin - roles/resourcemanager.projectIamAdmin
    • Service Usage Admin - roles/serviceusage.serviceUsageAdmin
    • Storage Admin - roles/storage.admin
    • Storage Object Admin - roles/storage.objectAdmin
    • Logging Admin - roles/logging.admin
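If you're missing any of these roles, an existing Organization Administrator can grant them. As a sketch (the member email below is a placeholder for your account), a single role can be granted at the organization level like this:

```shell
# Grant one of the required roles at the organization level.
# [USER_EMAIL] is a placeholder; repeat the command for each missing role.
gcloud organizations add-iam-policy-binding ${ORGANIZATION_ID} \
  --member="user:[USER_EMAIL]" \
  --role="roles/resourcemanager.projectCreator"
```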

Setting up environment variables

  1. Go to the Google Cloud Platform Console.
  2. Click Activate Cloud Shell.
  3. Run the following commands to set environment variables. Use the tools release version you downloaded during setup. This guide was written for tools version 3.3.0. For other tools versions, see the README included with the files you downloaded.

    # Release version you downloaded during setup
    export VERSION=[RELEASE_VERSION]
    
    # Directory to unzip the installation files
    export WORKING_DIR=${HOME}/scc-tools-install
    
    # Organization ID where the script will run
    export ORGANIZATION_ID=[YOUR_ORG_ID]
    
    # Project ID to be created
    export AUDIT_LOGS_PROJECT_ID=[YOUR_AUDIT_LOGS_PROJECT_ID]
    
    # A valid billing account ID
    export BILLING=[YOUR_BILLING_ACCOUNT_ID]
    
    # A location selected from the App Engine locations list:
    # https://cloud.google.com/appengine/docs/locations
    export GAE_LOCATION=[YOUR_LOCATION]
    
  4. On the Cloud Shell menu bar, click More, and then click Upload file.

  5. Upload the scc-audit-logs-${VERSION}.zip file you downloaded during the installation setup.

  6. Unzip the file you uploaded by running:

    unzip -qo scc-audit-logs-${VERSION}.zip -d ${WORKING_DIR}
    
  7. Go to the installation working directory:

    cd ${WORKING_DIR}
    

Installing the Audit Logs app package

In any of the following sections, you can simulate command execution by using the --simulation option.

Step 1: Creating the project

Create the project in which you'll install the Audit Logs app, and then link it to your billing account by running:

gcloud projects create ${AUDIT_LOGS_PROJECT_ID} \
  --organization ${ORGANIZATION_ID}

gcloud beta billing projects link ${AUDIT_LOGS_PROJECT_ID} \
  --billing-account ${BILLING}

Step 2: Enabling APIs

To enable the required Google APIs in the Audit Logs project, run:

gcloud services enable \
  cloudbuild.googleapis.com \
  cloudfunctions.googleapis.com \
  cloudresourcemanager.googleapis.com \
  compute.googleapis.com \
  datastore.googleapis.com \
  logging.googleapis.com \
  monitoring.googleapis.com \
  pubsub.googleapis.com \
  storage-component.googleapis.com \
  dataflow.googleapis.com \
  securitycenter.googleapis.com \
  --project ${AUDIT_LOGS_PROJECT_ID}

Step 3: Creating a Findings Editor service account

The Audit Logs app uses a service account to edit findings. This step requires the following Cloud IAM roles:

  • Organization Administrator - roles/resourcemanager.organizationAdmin
  • Security Center Admin - roles/securitycenter.admin
  • Service Account Admin - roles/iam.serviceAccountAdmin
  • Service Account Key Admin - roles/iam.serviceAccountKeyAdmin

These roles are necessary to grant the following roles to the service account:

  • Security Center Findings Editor - roles/securitycenter.findingsEditor
  • Security Center Sources Editor - roles/securitycenter.sourcesEditor

Create the service account that the application will use, download its key file, and grant the necessary roles by running:

  1. Create the Service Account:

    gcloud iam service-accounts create scc-finding-editor  \
     --display-name "SCC Finding Editor SA"  \
     --project ${AUDIT_LOGS_PROJECT_ID}
    
  2. Download the service account key file:

    (cd setup; \
    gcloud iam service-accounts keys create \
    service_accounts/scc-finding-editor-${AUDIT_LOGS_PROJECT_ID}-service-account.json \
    --iam-account scc-finding-editor@${AUDIT_LOGS_PROJECT_ID}.iam.gserviceaccount.com)
    
  3. Grant the Organization Level roles:

    gcloud beta organizations add-iam-policy-binding ${ORGANIZATION_ID} \
     --member="serviceAccount:scc-finding-editor@${AUDIT_LOGS_PROJECT_ID}.iam.gserviceaccount.com" \
     --role='roles/securitycenter.findingsEditor'
    
     gcloud beta organizations add-iam-policy-binding ${ORGANIZATION_ID} \
     --member="serviceAccount:scc-finding-editor@${AUDIT_LOGS_PROJECT_ID}.iam.gserviceaccount.com" \
     --role='roles/securitycenter.sourcesEditor'
    

Step 4: Enabling App Engine for the app

The Audit Logs app uses App Engine as its execution environment. To select an App Engine location and create an app to install to, run:

gcloud app create \
 --region  ${GAE_LOCATION} \
 --project ${AUDIT_LOGS_PROJECT_ID}

Note that you can't change the location after you set it.

Step 5: Creating an Organization Projects Browser service account

The Audit Logs app uses a service account to view the projects in your organization so that it can translate project IDs into project numbers. This step requires the following Cloud IAM role:

  • Organization Administrator - roles/resourcemanager.organizationAdmin

This role is necessary to grant the following role to the service account:

  • Browser - roles/browser

Create the service account that the application will use, download its key file, and grant the necessary roles by running:

  1. Create the Service Account:

     gcloud iam service-accounts create projects-browser  \
      --display-name "Projects Browser SA"  \
      --project ${AUDIT_LOGS_PROJECT_ID}
    
  2. Download the service account key file:

    (cd setup; \
    gcloud iam service-accounts keys create \
    service_accounts/projects-browser-${AUDIT_LOGS_PROJECT_ID}-service-account.json \
    --iam-account projects-browser@${AUDIT_LOGS_PROJECT_ID}.iam.gserviceaccount.com)
    
  3. Grant the Organization Level roles:

    gcloud beta organizations add-iam-policy-binding ${ORGANIZATION_ID} \
    --member="serviceAccount:projects-browser@${AUDIT_LOGS_PROJECT_ID}.iam.gserviceaccount.com" \
    --role='roles/browser'
    

Step 6: Creating an API key

Before you run the Audit Logs app setup, create an API key, restrict it to the Cloud SCC application, and then export its value to an environment variable:

  1. Go to the APIs & Services > Credentials page in the GCP Console.
  2. On the Create credentials drop-down list, click API key.
  3. On the API key created dialog that appears, copy your API key.
  4. Export the API key as an environment variable by running:

    export API_KEY=[YOUR_API_KEY]
    

Step 7: Getting Cloud SCC source IDs

The Audit Logs app uses Cloud SCC security sources to create findings from Cloud Audit Logs and Binary Authorization. In this step, you create new sources or retrieve the IDs of existing ones.

If you have already created Sources for Cloud Audit Logs and Binary Authorization, run the following commands to get the Source IDs:

export SCC_SA_FILE=setup/service_accounts/scc-finding-editor-${AUDIT_LOGS_PROJECT_ID}-service-account.json;
export URL=https://securitycenter.googleapis.com/v1beta1/organizations/${ORGANIZATION_ID}/sources?pageSize=100

gcloud auth activate-service-account --key-file=${SCC_SA_FILE}

export ACCESS_TOKEN=$(gcloud auth print-access-token)

curl -H "Authorization: OAuth ${ACCESS_TOKEN}" ${URL}

When you're finished, change the activated service account back to your user.
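To switch back, set the active account to your user ([YOUR_USER_EMAIL] is a placeholder for the account shown by `gcloud auth list`):

```shell
# Reactivate your own user account so later commands run with your
# permissions. [YOUR_USER_EMAIL] is a placeholder.
gcloud config set account [YOUR_USER_EMAIL]
```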

If you haven't created Sources for Cloud Audit Logs and Binary Authorization, run the following commands to create them:


export SCC_SA_FILE=setup/service_accounts/scc-finding-editor-${AUDIT_LOGS_PROJECT_ID}-service-account.json;
export URL="https://securitycenter.googleapis.com/v1beta1/organizations/${ORGANIZATION_ID}/sources"

gcloud auth activate-service-account --key-file=${SCC_SA_FILE}

export ACCESS_TOKEN=$(gcloud auth print-access-token)

export DATA='{"display_name": "Audit Logs", "description": "Audit Logs"}'

curl -H "Authorization: OAuth ${ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  --request POST \
  --data "${DATA}" \
  ${URL}

export DATA='{"display_name": "Binary Authorization", "description": "Binary Authorization"}'

curl -H "Authorization: OAuth ${ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  --request POST \
  --data "${DATA}" \
  ${URL}

When you're finished, change the activated service account back to your user.

Following is an example of a Cloud SCC Source object returned by the Cloud SCC API:

{
  "name": "organizations/<your-organization-id>/sources/<the-scc-source-id>",
  "displayName": "Audit Logs",
  "description": "Audit Logs"
}

Note the <the-scc-source-id> number for both Cloud Audit Logs and Binary Authorization. You'll use these in the next steps.
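If you prefer to capture the IDs programmatically, one possible approach is to extract the trailing number of the name field with sed. This is a sketch against a sample response, not part of the official setup:

```shell
# Sample Source object (illustrative values) as returned by the API.
RESPONSE='{"name": "organizations/1234567890/sources/5678", "displayName": "Audit Logs", "description": "Audit Logs"}'

# Capture the numeric source ID that follows "sources/" in the name field.
SOURCE_ID=$(echo "${RESPONSE}" | sed -n 's|.*sources/\([0-9]*\).*|\1|p')
echo "${SOURCE_ID}"
```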

Step 8: Deploy the application

Set the environment variables you'll use in the installation:

# one region from the Google Storage bucket locations for a REGIONAL bucket
# from: https://cloud.google.com/storage/docs/bucket-locations
export BUCKET_REGION=[YOUR_CLOUD_FUNCTION_BUCKET_REGION]

# API Key to access the Cloud SCC API from your Cloud SCC enabled project
# from: https://console.cloud.google.com/apis/credentials
export SCC_API_KEY=[YOUR_SCC_API_KEY]

# Absolute path to the Service Account file for the Cloud SCC API Project
export SCC_SA_FILE=[ABSOLUTE_PATH_TO_SERVICE_ACCOUNT_FILE]

# Absolute path to the Service Account file for organization wide project
# browser role
export ORG_BROWSER_SA_FILE=[ABSOLUTE_PATH_TO_SERVICE_ACCOUNT_FILE]

# Cloud SCC source ID used to create findings for Cloud Audit Logs
export SCC_AUDIT_LOG_SOURCE_ID=[SCC_AUDIT_LOG_SOURCE_ID]

# Cloud SCC source ID used to create findings for Binary Authorization
export SCC_BINARY_AUTHORIZATION_SOURCE_ID=[SCC_BINARY_AUTHORIZATION_SOURCE_ID]

To create the remaining infrastructure and Cloud Pub/Sub topics, and to deploy the application, run the commands below. If you want to simulate the execution, use --simulation instead of --no-simulation. In simulation mode, a BucketNotFoundException: 404 message might be displayed. This is expected: the bucket is created only when you run in non-simulation mode.

Note that the default Cloud Dataflow time window set for this app is 60 minutes. To change this value, use the option --df_window [MINUTES] where [MINUTES] is the number of minutes you want to use for the time window.

(cd scc-logs/setup; \
pipenv --python 3.5.3; \
pipenv install --ignore-pipfile; \
pipenv run python3 run_setup.py \
  --organization_id ${ORGANIZATION_ID} \
  --project ${AUDIT_LOGS_PROJECT_ID} \
  --bucket_region ${BUCKET_REGION} \
  --scc_api_key ${SCC_API_KEY} \
  --scc_sa_file ${SCC_SA_FILE} \
  --org_browser_sa_file ${ORG_BROWSER_SA_FILE} \
  --audit_logs_source_id ${SCC_AUDIT_LOG_SOURCE_ID} \
  --binary_authorization_source_id ${SCC_BINARY_AUTHORIZATION_SOURCE_ID} \
  --no-simulation)

It can take 5 minutes or longer for a log entry to be added as a finding in Cloud SCC.

Verifying the installation

To verify the Audit Logs app installation, use the GCP Console to check the following GCP Console pages:

  • Cloud Functions page, for the following triggers:
    • Topic: findings_to_save
    • Topic: log_sink_simple
  • Pub/Sub page for the following topic names:
    • findings_to_save
    • log_sink_aggregated
    • log_sink_single
    • topic_aggregated_findings
    • topic_single_findings
  • App Engine Services page for the binary-control-clsecteam service.
  • Dataflow page for the Cloud Dataflow pipeline.

To verify that sinks were created, run the following:

  1. Verify that you're logged in by running:

    gcloud auth list
    

    If you aren't the active user, follow the instructions in the command result to log in.

  2. Print sinks in the organization:

    gcloud logging sinks list --organization=${ORGANIZATION_ID}
    

The command prints the sinks in the organization so that you can verify that their destinations match the topics in your project.

To change the sink destination, run the following:

# The name of the sink for which you want to change the destination
export SINK_NAME=[SINK_TO_CHANGE]

# The full path of the topic in your project that the sink should send messages
# to. For example pubsub.googleapis.com/projects/<project_name>/topics/<topic_name>
export DESTINATION_TOPIC=[DESTINATION_TOPIC_FOR_THE_SINK]

# Update the sink destination
gcloud logging sinks update ${SINK_NAME} ${DESTINATION_TOPIC} --organization ${ORGANIZATION_ID}
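As a concrete sketch of the expected destination format (the project and topic names below are illustrative):

```shell
# Build the full Pub/Sub destination path for a sink from sample names.
PROJECT_NAME=my-audit-logs-project
TOPIC_NAME=log_sink_single
DESTINATION_TOPIC="pubsub.googleapis.com/projects/${PROJECT_NAME}/topics/${TOPIC_NAME}"
echo "${DESTINATION_TOPIC}"
```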

Using the Audit Logs app

After you install, configure, and verify the Audit Logs app, every entry found that matches the configured sink filter will create a single or aggregated finding in Cloud SCC.

You can check and change the configured filters with the list and update commands in the section above. Depending on the configured destination, findings will be sent to Cloud SCC as single entries, like creating a bucket, or as aggregated entries, like changing Cloud IAM rules.

Single workflow

  1. Create a bucket.
  2. The single sink with the filter that matches this change will publish a message in the destination topic.
  3. The log-single-converter Cloud Function will read that message and convert it to a format that Cloud SCC can interpret.
    1. After converting the message, the function publishes it to another topic.
  4. Periodically, App Engine will run, fetch the messages from that topic, and call the finding-scc-writer Cloud Function. By default, this runs every 5 minutes.
    1. The function will publish the message to Cloud SCC.
    2. The App Engine application throttles messages so it doesn't overload the Cloud SCC API. The limit is set to 10 messages per second by default.
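The throttling in the last step can be sketched as a batch-and-pause loop. This is an illustration of the idea, not the app's actual code; forward_message is a hypothetical stand-in for the real Cloud SCC publish call:

```shell
# Forward messages at no more than RATE per second by pausing after
# each full batch. forward_message is a hypothetical stand-in.
forward_message() { echo "forwarded: $1"; }

RATE=10
count=0
for msg in msg-1 msg-2 msg-3; do
  forward_message "${msg}"
  count=$((count + 1))
  if [ $((count % RATE)) -eq 0 ]; then
    sleep 1   # a full batch of RATE was sent; wait before the next one
  fi
done
```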

Aggregated workflow

  1. Add a Cloud IAM rule and save it, then replace a Cloud IAM rule with a different one and save it.
  2. The aggregated sink with the filter that matches this change will publish a message in the destination topic.
  3. Periodically, Cloud Dataflow will run and group the messages from the topic and send them to Cloud SCC. By default, this runs every 60 minutes.

Cleanup and restore Logging flow

To avoid being charged for the created resources, or to restore a previously running environment, run the cleanup or restore script below. The scripts do the following:

  • Delete or recreate Cloud Pub/Sub subscriptions
  • Delete or recreate Cloud Functions functions
  • Cancel or recreate Cloud Dataflow jobs

Set the environment variables required by the installation scripts:

# id of the project hosting scc audit logs application
export SCC_LOGS_PROJECT_ID=[YOUR_SCC_AUDIT_LOGS_PROJECT_ID]

# the organization id where these scripts will run
export ORGANIZATION_ID=[YOUR_ORG_ID]

# one region from the link below for REGIONAL bucket
# [Google storage bucket locations](https://cloud.google.com/storage/docs/bucket-locations)
export BUCKET_REGION=[YOUR_CLOUD_FUNCTION_BUCKET_REGION]

# API Key to access SCC API from your SCC enabled project
# [Google Credentials](https://console.cloud.google.com/apis/credentials)
export SCC_API_KEY=[YOUR_SCC_API_KEY]

# Absolute path to the Service Account file for the Security Command Center API Project
export SCC_SA_FILE=[ABSOLUTE_PATH_TO_SERVICE_ACCOUNT_FILE]

# Absolute path to the Service Account file for organization wide project browser role.
export ORG_BROWSER_SA_FILE=[ABSOLUTE_PATH_TO_SERVICE_ACCOUNT_FILE]

# source ID to create the findings from
export AUDIT_LOGS_SOURCE_ID=[SOURCE_ID_FROM_FINDINGS]

Cleanup

(cd scc-logs/setup; \
pipenv run python3 run_audit_logs_cleanup.py cleanup_chain \
  --project ${SCC_LOGS_PROJECT_ID} --no-simulation)

Restore Logging flow

(cd scc-logs/setup; \
pipenv run python3 run_audit_logs_cleanup.py restore_chain \
  --organization_id ${ORGANIZATION_ID} \
  --project ${SCC_LOGS_PROJECT_ID} \
  --bucket_region ${BUCKET_REGION} \
  --scc_api_key ${SCC_API_KEY} \
  --scc_sa_file ${SCC_SA_FILE} \
  --org_browser_sa_file ${ORG_BROWSER_SA_FILE} \
  --audit_logs_source_id ${AUDIT_LOGS_SOURCE_ID} \
  --no-simulation)

Changing the finding creation frequency

The Throttler component is an App Engine application that controls the finding creation flow for Cloud SCC single and aggregated audit logs. The default frequency is 5 minutes.

To change the default frequency value for Throttler, change the schedule parameter in throttler/cron.yaml, as described in the cron.yaml reference:

cron:
- description: "Pull and forward findings"
  url: /pull-forward-findings
  target: default
  schedule: every 5 minutes
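For example, to move the schedule from 5 to 10 minutes you could edit the schedule line. The snippet below recreates the file locally for illustration; in the real setup you would edit throttler/cron.yaml directly:

```shell
# Recreate the cron.yaml content (illustrative copy), then change the
# schedule from 5 to 10 minutes with sed.
printf '%s\n' \
  'cron:' \
  '- description: "Pull and forward findings"' \
  '  url: /pull-forward-findings' \
  '  target: default' \
  '  schedule: every 5 minutes' > cron.yaml

sed -i 's/every 5 minutes/every 10 minutes/' cron.yaml
grep 'schedule:' cron.yaml
```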

To apply a change to the default frequency, run the following:

# id of the project hosting scc audit logs application
export SCC_LOGS_PROJECT_ID=[YOUR_SCC_AUDIT_LOGS_PROJECT_ID]

(cd scc-logs/throttler; \
gcloud app deploy cron.yaml --quiet --project ${SCC_LOGS_PROJECT_ID})

To force Throttler execution, access the URL generated by the command below and then click Run now:

# id of the project hosting scc audit logs application
export SCC_LOGS_PROJECT_ID=[YOUR_SCC_AUDIT_LOGS_PROJECT_ID]

# the organization id
export ORGANIZATION_ID=[YOUR_ORG_ID]

# cron jobs base url
export BASE_URL=https://console.cloud.google.com/appengine/cronjobs

# prints the throttler cron overview url
echo "${BASE_URL}?organizationId=${ORGANIZATION_ID}&project=${SCC_LOGS_PROJECT_ID}"

For more information, see Scheduling Tasks with Cron for Python.
