Data Export API

The Chronicle Data Export API enables customers to export their data from their Chronicle account to their Google Cloud Storage buckets and manage existing export requests.

Note: Before exporting data from Chronicle, customers must create their own Google Cloud Storage bucket (make sure that the bucket is not publicly accessible) and grant malachite-data-export-batch@prod.google.com the objectAdmin and legacyBucketReader roles on that Google Cloud Storage bucket.

Use the Cloud Console or the gsutil command-line tool to grant these roles; for example:

gsutil iam ch user:malachite-data-export-batch@prod.google.com:objectAdmin,legacyBucketReader gs://<your-bucket-name>
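The same grant can also be applied programmatically. The following is an illustrative sketch (not from the official documentation) that assumes the google-cloud-storage Python package; the helper names are hypothetical:

```python
# Illustrative sketch: apply the same grant as the gsutil command above
# using the google-cloud-storage package (pip install google-cloud-storage).
EXPORT_ACCOUNT = "malachite-data-export-batch@prod.google.com"

def export_role_bindings(member: str) -> list:
    """IAM bindings matching the two roles granted by the gsutil command."""
    return [
        {"role": "roles/storage.objectAdmin", "members": {member}},
        {"role": "roles/storage.legacyBucketReader", "members": {member}},
    ]

def grant_export_access(bucket_name: str) -> None:
    """Append the export account's bindings to the bucket's IAM policy."""
    from google.cloud import storage  # network call; needs project credentials
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings += export_role_bindings(f"user:{EXPORT_ACCOUNT}")
    bucket.set_iam_policy(policy)
```

Calling grant_export_access('<your-bucket-name>') performs the same change as the gsutil command; the bindings are built in a separate helper so they can be inspected before being applied.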

How to authenticate with the Chronicle API

This Chronicle API uses the OAuth 2.0 protocol for authentication and authorization. Your application can complete these tasks using either of the following implementations:

  • Using the Google API client library for your programming language.

  • Directly interfacing with the OAuth 2.0 system using HTTP.

See the reference documentation for the Google Authentication library in Python.

Google Authentication libraries are a subset of the Google API client libraries. See other language implementations.

Getting API authentication credentials

Your Chronicle representative will provide you with a Google Developer Service Account Credential to enable the API client to communicate with the API.

You also need to provide the Auth Scope when initializing your API client. OAuth 2.0 uses a scope to limit an application's access to an account. When an application requests a scope, the access token issued to the application is limited to the scope granted.

Use the following scope to initialize your Google API client:

https://www.googleapis.com/auth/chronicle-backstory

Python example

The following Python example demonstrates how to create OAuth 2.0 credentials and an authorized HTTP client using the google.oauth2 and googleapiclient packages.

# Imports required for the sample: Google Auth and API Client Library imports.
# Install the package from https://pypi.org/project/google-api-python-client/
# or run: pip install google-api-python-client
from google.oauth2 import service_account
from googleapiclient import _auth

SCOPES = ['https://www.googleapis.com/auth/chronicle-backstory']

# The apikeys-demo.json file contains the customer's OAuth 2 credentials.
# SERVICE_ACCOUNT_FILE is the full path to the apikeys-demo.json file
# TODO: Replace this with the full path to your OAuth 2.0 credentials file
SERVICE_ACCOUNT_FILE = '/customer-keys/apikeys-demo.json'

# Create a credential using Google Developer Service Account Credential and Chronicle API
# Scope.
credentials = service_account.Credentials.from_service_account_file(SERVICE_ACCOUNT_FILE, scopes=SCOPES)

# Build an HTTP client to make authorized OAuth requests.
http_client = _auth.authorized_http(credentials)

# <your code continues here>

Data Export API reference

The following sections describe the Chronicle Data Export API methods.

Note: All requests must be made using authenticated Google API client libraries, as described in How to authenticate with the Chronicle API. All responses are provided in JSON.

CreateDataExport

Creates a new data export.

Note: CreateDataExport uses the POST method.

Request

Request Body
{
  "startTime": "Start, inclusive time of the time range",
  "endTime": "End, exclusive time of the time range",
  "logType": "An individual log type or 'ALL_TYPES' for all log types",
  "gcsBucket": "Path to the customer-provided Google Cloud Storage bucket, in 'projects/<project-id>/buckets/<bucket-name>' format"
}
Parameters
Parameter Name	Type	Description
startTime	google.protobuf.Timestamp	(Optional) Inclusive start of the time range. If not specified, defaults to the UNIX epoch (1970-01-01T00:00:00Z).
endTime	google.protobuf.Timestamp	(Optional) Exclusive end of the time range. If not specified, defaults to the current time.
logType	string	(Required) An individual log type, or ALL_TYPES for all log types.
gcsBucket	string	(Required) Path to the customer-provided Google Cloud Storage bucket, in projects/<project-id>/buckets/<bucket-name> format.
Sample Request
https://backstory.googleapis.com/v1/tools/dataexport
{
  "startTime": "2020-03-01T00:00:00Z",
  "endTime": "2020-03-15T00:00:00Z",
  "logType": "UDM",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket"
}
Sample Response
{
  "dataExportId": "d828bcec-21d3-4ecd-910e-0a934f0bd074",
  "startTime": "2020-03-01T00:00:00Z",
  "endTime": "2020-03-15T00:00:00Z",
  "logType": "UDM",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
  "dataExportStatus": {"stage": "IN_QUEUE"}
}
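Continuing the Python authentication example above, a CreateDataExport call can be sketched as follows. The http_client variable comes from that example; the request-building helper is illustrative:

```python
import json

# Endpoint and sample values are taken from the CreateDataExport reference above.
CREATE_URL = "https://backstory.googleapis.com/v1/tools/dataexport"

def build_create_body(start_time: str, end_time: str,
                      log_type: str, gcs_bucket: str) -> str:
    """Serialize the CreateDataExport request body shown above."""
    return json.dumps({
        "startTime": start_time,
        "endTime": end_time,
        "logType": log_type,
        "gcsBucket": gcs_bucket,
    })

body = build_create_body(
    "2020-03-01T00:00:00Z", "2020-03-15T00:00:00Z", "UDM",
    "projects/chronicle-test/buckets/dataexport-test-bucket")

# With the authorized client from the authentication example:
# response, content = http_client.request(
#     CREATE_URL, "POST", body=body,
#     headers={"Content-Type": "application/json"})
# data_export_id = json.loads(content)["dataExportId"]
```

The returned dataExportId is what the GetDataExport and CancelDataExport methods below operate on.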

GetDataExport

Returns an existing data export.

Note: GetDataExport uses the GET method.

Request

https://backstory.googleapis.com/v1/tools/dataexport/{data_export_id}
GetDataExport is a GET request, so there is no request body; the data export ID is supplied in the {data_export_id} path parameter.
Parameters
Parameter Name Type Description
dataExportId string UUID representing the data export request.
Sample Request
https://backstory.googleapis.com/v1/tools/dataexport/d828bcec-21d3-4ecd-910e-0a934f0bd074
Sample Response
{
  "dataExportId": "d828bcec-21d3-4ecd-910e-0a934f0bd074",
  "startTime": "2020-03-01T00:00:00Z",
  "endTime": "2020-03-15T00:00:00Z",
  "logType": "UDM",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
  "dataExportStatus": {"stage": "IN_QUEUE"}
}
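Since a newly created export starts in the IN_QUEUE stage, a common pattern is to poll GetDataExport until the stage changes. A sketch, reusing the http_client from the authentication example (the helper name is illustrative):

```python
# Sketch: build the GetDataExport URL for a given export request.
GET_URL_TEMPLATE = "https://backstory.googleapis.com/v1/tools/dataexport/{}"

def build_get_url(data_export_id: str) -> str:
    """Fill the data export ID into the GetDataExport URL path."""
    return GET_URL_TEMPLATE.format(data_export_id)

url = build_get_url("d828bcec-21d3-4ecd-910e-0a934f0bd074")

# With the authorized client from the authentication example:
# import json, time
# while True:
#     response, content = http_client.request(url, "GET")
#     stage = json.loads(content)["dataExportStatus"]["stage"]
#     if stage != "IN_QUEUE":
#         break
#     time.sleep(60)
```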

CancelDataExport

Cancels an existing data export request.

Note: CancelDataExport uses the POST method. Only IN_QUEUE data exports can be canceled.

Request

https://backstory.googleapis.com/v1/tools/dataexport/{data_export_id}:cancel
Request Body
{
  "dataExportId": "The UUID representing the data export request to be canceled"
}
Parameters
Parameter Name Type Description
dataExportId string UUID representing the data export request to be canceled.
Sample Request
https://backstory.googleapis.com/v1/tools/dataexport/d828bcec-21d3-4ecd-910e-0a934f0bd074:cancel
Sample Response
{
  "dataExportId": "d828bcec-21d3-4ecd-910e-0a934f0bd074",
  "startTime": "2020-03-01T00:00:00Z",
  "endTime": "2020-03-15T00:00:00Z",
  "logType": "UDM",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
  "dataExportStatus": {"stage": "CANCELLED"}
}
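A cancellation request follows the same pattern; the helper below is an illustrative sketch, again assuming the http_client from the authentication example:

```python
# Sketch: build the CancelDataExport URL for a given export request.
CANCEL_URL_TEMPLATE = "https://backstory.googleapis.com/v1/tools/dataexport/{}:cancel"

def build_cancel_url(data_export_id: str) -> str:
    """Fill the data export ID into the CancelDataExport URL path."""
    return CANCEL_URL_TEMPLATE.format(data_export_id)

url = build_cancel_url("d828bcec-21d3-4ecd-910e-0a934f0bd074")

# With the authorized client from the authentication example:
# response, content = http_client.request(url, "POST")
```

Remember that, per the note above, only exports still in the IN_QUEUE stage can be canceled.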

ListAvailableLogTypes

Lists all available log types, along with the time range for which each is available.

Note: ListAvailableLogTypes uses the GET method.

Request

https://backstory.googleapis.com/v1/tools/dataexport/listavailablelogtypes 
Request Body
{
 "startTime": "Start, inclusive time from the time range",
 "endTime": "Last, exclusive time from the time range"
}
Parameters
Parameter Name	Type	Description
startTime	google.protobuf.Timestamp	(Optional) Inclusive start of the time range. If not specified, defaults to the UNIX epoch (1970-01-01T00:00:00Z).
endTime	google.protobuf.Timestamp	(Optional) Exclusive end of the time range. If not specified, defaults to the current time.
Sample Request
https://backstory.googleapis.com/v1/tools/dataexport/listavailablelogtypes
{
 "startTime": "2020-01-01T00:00:00Z",
 "endTime": "2021-01-01T00:00:00Z"
}
Sample Response
{
  "availableLogTypes": [
    {"logType": "ACALVIO", "startTime": "2020-03-02T02:00:00Z", "endTime": "2020-08-02T11:00:00Z"},
    {"logType": "AZURE_AD", "startTime": "2020-02-10T22:00:00Z", "endTime": "2020-02-13T02:00:00Z"}
  ]
}
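A ListAvailableLogTypes call can be sketched in the same style as the other methods; the request-building helper is illustrative, and http_client comes from the authentication example:

```python
import json

# Endpoint and sample time range are taken from the reference above.
LIST_URL = "https://backstory.googleapis.com/v1/tools/dataexport/listavailablelogtypes"

def build_list_body(start_time: str, end_time: str) -> str:
    """Serialize the ListAvailableLogTypes request body shown above."""
    return json.dumps({"startTime": start_time, "endTime": end_time})

body = build_list_body("2020-01-01T00:00:00Z", "2021-01-01T00:00:00Z")

# With the authorized client from the authentication example:
# response, content = http_client.request(LIST_URL, "GET", body=body)
# for entry in json.loads(content)["availableLogTypes"]:
#     print(entry["logType"], entry["startTime"], entry["endTime"])
```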