Data Export API (Enhanced)

Important: After the new enhanced API is enabled, you can't use the API to access your old, existing jobs.

The Data Export API is a robust, high-performance service designed to facilitate the bulk export of your security data from Google Security Operations to a Google Cloud Storage bucket that you control. This feature addresses the critical need for long-term data retention, supporting strict compliance requirements (such as SOX, HIPAA, and GDPR) and historical forensic analysis.

This Data Export API is engineered to be a scalable and reliable solution for point-in-time data exports, handling requests of up to 100 TB. It functions as a managed pipeline, providing essential enterprise-grade features, such as automated retries on transient errors, comprehensive job status monitoring, and a full audit trail for each export job. Exported data is logically partitioned by date and time within your Google Cloud Storage bucket.

This feature lets you build large-scale data offloading workflows while Google SecOps manages the complexity of the export process, providing stability and performance.

Key benefits

The Data Export API provides a resilient and auditable solution for managing the lifecycle of your security data.

  • Reliability: The service is built to handle large-scale data transfers. Export jobs that encounter transient issues, such as temporary network problems, are automatically retried several times using an exponential backoff strategy. If a job fails permanently after all retries, its status updates to FINISHED_FAILURE, and the API response for that job contains a detailed error message explaining the cause.

  • Comprehensive auditability: To meet strict compliance and security standards, every action related to an export job is captured in an immutable audit trail. This trail includes the creation, start, success, or failure of every job, along with the user who initiated the action, a timestamp, and the job parameters.

  • Optimized for performance and scale: The API is underpinned by a robust job management system that includes queuing and prioritization to ensure platform stability and prevent any single tenant from monopolizing resources.

  • Enhanced data integrity and accessibility: Data is automatically organized into a logical directory structure within your Google Cloud Storage bucket, helping you locate and query specific time windows for historical analysis.

Key terms and concepts

  • Export job: A single, asynchronous operation to export a specific time range of log data to a Google Cloud Storage bucket. Each job is tracked with a unique dataExportId.
  • Job status: The current state of an export job in its lifecycle (for example, IN_QUEUE, PROCESSING, FINISHED_SUCCESS).
  • Google Cloud Storage bucket: A user-owned Google Cloud Storage bucket that serves as the destination for the exported data.
  • Log types: The specific categories of logs to export (for example, NIX_SYSTEM, WINDOWS_DNS, CB_EDR). For more details, see the list of all supported log types.

Understand the exported data structure

When a job completes successfully, the data is written to your Google Cloud Storage bucket with a specific, partitioned directory structure to facilitate easier access and querying.

Directory path structure: gs://<gcs-bucket-name>/<export-job-name>/<logtype>/<event-time-bucket>/<epoch-execution-time>/<file-shard-name>.csv

  • gcs-bucket-name: The name of your Google Cloud Storage bucket.
  • export-job-name: The unique name of your export job.
  • logtype: The name of the log type for the exported data.
  • event-time-bucket: The hourly bucket of the event timestamps of the exported logs, represented as a UTC timestamp in the format year/month/day/hour/minute/second. For example, 2025/08/25/01/00/00 refers to 01:00:00 UTC on August 25, 2025.
  • epoch-execution-time: The Unix epoch time value when the export job was initiated.
  • file-shard-name: The name of the sharded files containing raw logs. Each file is bound by an upper file size limit of 100 MB.
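
If you need to enumerate the shards a job produced, you can list objects under the job's prefix. The following is a minimal sketch using the google-cloud-storage Python client with Application Default Credentials; the bucket, job, and log type names are placeholders.

# Sketch: list the exported CSV shards for one job and log type.
# Assumes the google-cloud-storage client library and Application Default
# Credentials; the bucket and job names below are placeholders.
from google.cloud import storage

BUCKET_NAME = "dataexport-test-bucket"                      # your destination bucket
EXPORT_JOB_NAME = "b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d"    # hypothetical job name
LOG_TYPE = "GCP_DNS"

client = storage.Client()
# Objects are keyed as <export-job-name>/<logtype>/<event-time-bucket>/...,
# so a prefix narrows the listing to one job and log type.
prefix = f"{EXPORT_JOB_NAME}/{LOG_TYPE}/"
for blob in client.list_blobs(BUCKET_NAME, prefix=prefix):
    print(blob.name, blob.size)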

Performance and limitations

The service is designed with specific limits to ensure platform stability and fair resource allocation.

  • Maximum data volume per job: Each individual export job can request up to 100 TB of data. For larger datasets, we recommend breaking the export into multiple jobs with smaller time ranges (see the sketch after this list).
  • Concurrent jobs: Each customer tenant can run or queue a maximum of 3 export jobs concurrently. Any new job creation request that exceeds this limit will be rejected.
  • Job completion times: Job completion times vary based on the volume of data exported. A single job can take up to 18 hours.
  • Export format and data scope: This API is focused on bulk, point-in-time exports; exported data is written to your bucket as sharded CSV files of raw logs (see Understand the exported data structure).
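
A simple way to keep each job under the volume limit is to split a long event-time range into smaller windows and create one job per window (keeping at most 3 jobs queued or running at a time). The helper below is a sketch; the one-day window size is an assumption you would tune to your data volume.

# Sketch: split a large event-time range into daily windows, one export job each.
# The one-day chunk size is an arbitrary assumption; tune it so each job's
# estimated volume stays well under the 100 TB per-job limit.
from datetime import datetime, timedelta, timezone

def time_windows(start: datetime, end: datetime, step: timedelta = timedelta(days=1)):
    """Yield (window_start, window_end) pairs covering [start, end)."""
    cursor = start
    while cursor < end:
        window_end = min(cursor + step, end)
        yield cursor, window_end
        cursor = window_end

start = datetime(2025, 8, 1, tzinfo=timezone.utc)
end = datetime(2025, 8, 8, tzinfo=timezone.utc)
for window_start, window_end in time_windows(start, end):
    # Each pair becomes the startTime/endTime of a separate CreateDataExport call.
    print(window_start.isoformat(), "->", window_end.isoformat())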

Prerequisites and architecture

This section outlines the necessary requirements for using the Data Export API and details the system architecture. Reviewing these components confirms that your environment is correctly set up.

Before you begin

Before you can use the Data Export API, you must complete a few prerequisite steps to set up your Google Cloud Storage destination and grant the necessary permissions.

  1. Grant permissions to the API user: You must have one of the following IAM roles to use the Data Export API, depending on the level of access you need.

    • Chronicle administrator (creating/managing jobs): Grants full permissions to create, update, cancel, and view export jobs using the API.
    • Chronicle Viewer: Grants read-only access to view job configurations and history using the API.
  2. Create a Google Cloud Storage bucket: In your Google Cloud project, create a new Google Cloud Storage bucket (the destination for your exported data) in the same region as your Google SecOps tenant, and make it private to prevent unauthorized access. For details, see Create a bucket.

  3. Grant permissions to the Service Account: You must grant the Google SecOps Service Account linked to your Google SecOps tenant the necessary IAM roles to write data to your bucket.

    1. Call the FetchServiceAccountForDataExport API endpoint to identify the unique service account for your Google SecOps instance. The API returns the Service Account email.

      Example request:

      {
        "parent": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
      }
      

      Example Response:

      {
        "service_account_email": "service-1234@gcp-sa-chronicle.iam.gserviceaccount.com"
      }
      
    2. Grant the Google SecOps Service Account principal the following IAM roles on the destination Google Cloud Storage bucket. These role grants let the Google SecOps service write the exported data files into your Google Cloud Storage bucket. A sketch of granting these roles programmatically follows this list.

      • Storage Object Admin (roles/storage.objectAdmin)
      • Storage Legacy Bucket Reader (roles/storage.legacyBucketReader)

      For details, see Grant access to the Google SecOps Service Account.

  4. Complete authentication: The Data Export API authenticates the calls you send. To set up this authentication, follow the instructions in the following sections:

    1. Authentication methods for Google Cloud services
    2. Application default credentials
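
As a convenience, the role grants from step 3 can also be applied programmatically. The following is a minimal sketch using the google-cloud-storage Python client with Application Default Credentials; the bucket name and service account email are the placeholder values from the example response above.

# Sketch: grant the Google SecOps service account the two roles from step 3
# on the destination bucket. Assumes the google-cloud-storage client library,
# Application Default Credentials, and the placeholder names from the examples above.
from google.cloud import storage

BUCKET_NAME = "dataexport-test-bucket"
SECOPS_SA = "serviceAccount:service-1234@gcp-sa-chronicle.iam.gserviceaccount.com"

client = storage.Client()
bucket = client.bucket(BUCKET_NAME)
policy = bucket.get_iam_policy(requested_policy_version=3)
for role in ("roles/storage.objectAdmin", "roles/storage.legacyBucketReader"):
    # Add a binding for each role granted to the SecOps service account.
    policy.bindings.append({"role": role, "members": [SECOPS_SA]})
bucket.set_iam_policy(policy)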

Key use cases

The Data Export API provides a suite of endpoints to create data export jobs and manage the entire lifecycle of bulk data export. All interactions are performed using API calls.

The following use cases describe how to create, monitor, and manage data export jobs.

Core workflow

This section explains how to manage the lifecycle of your export jobs.

Create a new data export job

Data export job specifications are stored on the parent resource (the Google SecOps instance), which contains the source log data for the export job.

To start a new export, send a POST request to the dataExports.create endpoint. For details, see CreateDataExport endpoint.

Monitor data export job status

View data export job details and job status for a specific export job, or set a filter to view certain types of jobs.

  • To view a specific export job, see GetDataExport for details.

  • To list certain types of data export jobs according to a filter, see ListDataExport for details.

Manage queued jobs

You can modify or cancel a job while it is in the IN_QUEUE status.

  • For details on how to change parameters (such as the time range, list of log types, or the destination bucket), see UpdateDataExport.

  • For details on how to cancel a queued job, see CancelDataExport.

Data Export API reference

After the prerequisites are fulfilled, you can begin using the Data Export API.

The following sections describe the Chronicle Data Export API endpoints.

CreateDataExport

Use this endpoint to create a specification for a bulk data export job. The job specification is stored on the parent resource — the Google SecOps instance containing the source log data.

Data is exported according to the First-In, First-Out (FIFO) principle, independent of data size.

For details, see Method: dataExports.create.

Request

Endpoint: POST https://chronicle.googleapis.com/v1alpha/{parent}/dataExports

Path parameters
Field Type Required Description
parent string required The Google SecOps instance where the data is exported from, specified in the following format:
projects/{project}/locations/{region}/instances/{instance}
where:
{project}: Identifier of your project.
{region}: Region where your destination bucket is located. See the list of regions.
{instance}: Identifier of the source Google SecOps instance.
Request body

Post a request to the endpoint using the following parameters.

Sample request
{
  "parent": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
  "dataExport": {
    "startTime": "2025-08-01T00:00:00Z",
    "endTime": "2025-08-02T00:00:00Z",
    "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
    "includeLogTypes": [
      "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS",
      "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_FIREWALL"
    ]
  }
}
Body parameters
Field Type Required Description
parent string required The Google SecOps instance where the data is exported from, specified in the following format:
projects/{project}/locations/{region}/instances/{instance}
where:
{project}: Identifier of your project.
{region}: Region where your destination bucket is located. See the list of regions.
{instance}: Identifier of the source Google SecOps instance.
dataExport object required The specification of the log event data to export, and the destination bucket.
startTime string optional The beginning of the event time range of the data to export, based on the event timestamp.
Format: String in google.protobuf.Timestamp format.
If not specified, this defaults to 01/01/1970 UTC.
endTime string optional The end of the event time range of the data to export.
Format: String in google.protobuf.Timestamp format.
If not specified, this defaults to the current time.
gcsBucket string required The path to your Google Cloud Storage destination bucket, specified in the following format:
/projects/{project-id}/buckets/{bucket-name}.
Note: The destination bucket must be in the same region as the source Google SecOps tenant.
includeLogTypes array optional An array of one or more log types you want to export, each specified as a full resource name. If not specified, all log types are exported by default.
Sample response

Upon successful creation of the data export job, the API returns a unique name for the data export job, and the job's initial status, which is IN_QUEUE. The response also includes an estimatedVolume of the data expected to be exported.

{
  "name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
  "startTime": "2025-08-01T00:00:00Z",
  "endTime": "2025-08-02T00:00:00Z",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
  "includeLogTypes": [
    "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS",
    "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_FIREWALL"
  ],
  "dataExportStatus": {
    "stage": "IN_QUEUE"
  },
  "estimatedVolume": "10737418240",
  "createTime": "2025-08-13T11:00:00Z",
  "updateTime": "2025-08-13T11:00:00Z"
}

Response parameters
Parameter Type Description
name string The unique resource name of the data export job.
The dataExportId is the last segment of the name parameter; it is the UUID used in other calls to reference the data export job.
startTime string Starting time range.
Format: String in google.protobuf.Timestamp format.
endTime string Ending time range.
Format: String in google.protobuf.Timestamp format.
gcsBucket string The path to your Google Cloud Storage destination bucket, specified in the following format:
/projects/{project-id}/buckets/{bucket-name}.
includeLogTypes list A comma-separated list of log types included.
dataExportStatus.stage string The status of the export job at the time of creation (always IN_QUEUE).
estimatedVolume string The estimated export volume in bytes.
createTime string Job creation time.
Format: String in google.protobuf.Timestamp format.
updateTime string Job update time.
Format: String in google.protobuf.Timestamp format.
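
For illustration, the following Python sketch sends the sample request above using Application Default Credentials. It assumes the dataExport fields are sent flat as the HTTP request body with the parent in the URL (the common REST mapping); the parent, bucket, and OAuth scope values are placeholders or assumptions, so check the dataExports.create reference for the authoritative mapping.

# Sketch: create a data export job by POSTing the sample request body above.
# Assumes google-auth and Application Default Credentials; the parent and
# bucket values are the placeholders used in the sample request.
import google.auth
from google.auth.transport.requests import AuthorizedSession

PARENT = "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
session = AuthorizedSession(credentials)

body = {
    "startTime": "2025-08-01T00:00:00Z",
    "endTime": "2025-08-02T00:00:00Z",
    "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
    "includeLogTypes": [
        f"{PARENT}/logTypes/GCP_DNS",
        f"{PARENT}/logTypes/GCP_FIREWALL",
    ],
}
response = session.post(
    f"https://chronicle.googleapis.com/v1alpha/{PARENT}/dataExports", json=body)
response.raise_for_status()
job = response.json()
print(job["name"], job["dataExportStatus"]["stage"])  # expected initial stage: IN_QUEUE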

GetDataExport

Retrieves the current status and details of a specific data export job, using its dataExportId.

Request

Endpoint: GET https://chronicle.{region}.rep.googleapis.com/v1alpha/{name}

Path parameters
Field Type Required Description
name string required The name of the data export job to retrieve, specified in the following format:
projects/{project}/locations/{region}/instances/{instance}/dataexports/{dataExportId}
where:
{project}: Identifier of your project.
{region}: Region where your destination bucket is located. See the list of regions.
{instance}: Identifier of the source Google SecOps instance.
{dataExportId}: UUID identifier of the data export job.
Sample request
{
  "name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d"
}
Request body

The request body must be empty.

Sample response

The response will contain the full job details, including its current status. For completed jobs, the response will also return the actual volume of data successfully exported.

{
  "name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
  "startTime": "2025-08-01T00:00:00Z",
  "endTime": "2025-08-02T00:00:00Z",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
  "includeLogTypes": [
    "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS",
    "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_FIREWALL"
  ],
  "dataExportStatus": {
    "stage": "FINISHED_SUCCESS",
    "exportedGlobPatterns": [
      "/bigstore/<bucket>/<dataexportid>/exported_paths.txt"
    ]
  },
  "estimatedVolume": "10737418240",
  "exportedVolume": "10938428241",
  "createTime": "2025-08-13T11:00:00Z",
  "updateTime": "2025-08-13T11:05:00Z"
}
Response parameters
Parameter Type Description
name string Unique name for a data export job.
startTime string Starting time range.
endTime string Ending time range.
gcsBucket string The path to your Google Cloud Storage destination bucket, specified in the following format:
/projects/{project-id}/buckets/{bucket-name}.
includeLogTypes list A comma-separated list of included log types.
dataExportStatus.stage string Current status of the data export job, which can have one of the following values:
  • IN_QUEUE: The job has been accepted and is waiting for resources to become available.
  • PROCESSING: The job is actively being executed.
  • FINISHED_SUCCESS: The job completed successfully. The response will include the final exportedVolume.
  • FINISHED_FAILURE: The job failed after all retry attempts. The response will include detailed error information.
  • CANCELLED: The job was cancelled by a user before it started processing.

(For more details, see the DataExportStatus reference.)
dataExportStatus.exportedGlobPatterns list File path of the exported text file, containing a list of all the exported file shards (exported glob patterns) created in the destination bucket.
estimatedVolume string The estimated export volume in bytes.
exportedVolume string For completed jobs, the actual volume of data exported.
createTime string Job creation time.
updateTime string Job update time.
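
A common pattern is to poll the job until it reaches a terminal stage. The following Python sketch does this with Application Default Credentials; the job name, regional endpoint, and polling interval are placeholders or assumptions.

# Sketch: poll a data export job until it reaches a terminal state.
# Assumes google-auth with Application Default Credentials; the job name and
# regional endpoint ("us") are placeholders from the samples above.
import time

import google.auth
from google.auth.transport.requests import AuthorizedSession

JOB_NAME = ("projects/myproject/locations/us/instances/"
            "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/"
            "b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d")
ENDPOINT = "https://chronicle.us.rep.googleapis.com/v1alpha"

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
session = AuthorizedSession(credentials)

while True:
    response = session.get(f"{ENDPOINT}/{JOB_NAME}")
    response.raise_for_status()
    job = response.json()
    stage = job["dataExportStatus"]["stage"]
    if stage in ("FINISHED_SUCCESS", "FINISHED_FAILURE", "CANCELLED"):
        print("Final stage:", stage, "exportedVolume:", job.get("exportedVolume"))
        break
    print("Current stage:", stage)
    time.sleep(300)  # long-running jobs can take hours; poll sparingly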

ListDataExport

List data export jobs associated with a Google SecOps instance. You can optionally filter the list to narrow the results by the creation time of the export job, the job name, and the current job status.

Request

HTTP method: GET

Endpoint: GET https://chronicle.googleapis.com/v1alpha/{parent}/dataExports

Path parameters
Field Type Required Description
parent string required The Google SecOps instance whose data export requests are to be listed, specified in the following format:
projects/{project}/locations/{region}/instances/{instance}
where:
{project}: Identifier of your project.
{region}: Region where your destination bucket is located. See the list of regions.
{instance}: Identifier of the source Google SecOps instance.
Request body

Send a list request, using optional filters to narrow the results, for example, by the createTime of the data export job, the job name, and the current job status (dataExportStatus.stage).

Sample request
{
  "parent": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
  "pageSize": 2,
  "filter": "dataExportStatus.stage:(\"FINISHED_SUCCESS\" OR \"CANCELLED\") AND createTime >= \"2025-08-29T00:00:00Z\" AND createTime <= \"2025-09-09T00:00:00Z\" AND name = \"projects/140410331797/locations/us/instances/ebdc4bb9-878b-11e7-8455-10604b7cb5c1/dataExports/ed3f735d-3347-439a-9161-1d474407eae2\""
}
Request parameters
Field Type Required Description
parent string required The Google SecOps instance whose Data Export requests are to be listed, specified in the following format:
projects/{project}/locations/{region}/instances/{instance}
where:
{project}: Identifier of your project.
{region}: Region where your destination bucket is located. See the list of regions.
{instance}: Identifier of the source Google SecOps instance.
pageSize integer optional The maximum number of export jobs to return. The response may return fewer results. If unspecified, by default a list of 10 jobs will be returned. The maximum number of jobs that can be returned in a single request is 100.
pageToken string optional A string value returned in a paginated response, which can be used to retrieve the subsequent page.
filter string optional Filters you can apply to the list of jobs returned:
  • "dataExportStatus.stage:{list of log types}": A list of job status values separated by the OR operator, to be included in the list request.
  • "createTime >= {timestamp}": The start of the time range to be included, based on creationTime.
  • "createTime <= {timestamp}": The end of the time range to be included, based on creationTime.
  • "name = {list of names}": A list of job names separated by the OR operator, to be included in the list request.
Sample response

The response returns a paginated array of data export job objects that match the filter criteria. Each object contains full job details and the current job status. Completed jobs include the actual volume of data successfully exported.

{
  "dataExports": [
    {
      "name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
      "startTime": "2025-08-01T00:00:00Z",
      "endTime": "2025-08-03T00:00:00Z",
      "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
      "includeLogTypes": [
        "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS",
        "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_FIREWALL"
      ],
      "dataExportStatus": {
        "stage": "CANCELLED"
      },
      "estimatedVolume": "10737418240",
      "createTime": "2025-08-01T11:00:00Z",
      "updateTime": "2025-08-13T11:10:00Z"
    },
    {
      "name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/f1e2d3c4-b5a6-7890-1234-567890abcdef",
      "startTime": "2025-08-03T00:00:00Z",
      "endTime": "2025-08-04T00:00:00Z",
      "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
      "dataExportStatus": {
        "stage": "FINISHED_SUCCESS",
        "exportedGlobPatterns": [
          "/bigstore/<bucket>/<dataexportid>/exported_paths.txt"
        ]
      },
      "estimatedVolume": "53687091200",
      "exportedVolume": "54687091205",
      "createTime": "2025-08-01T09:00:00Z",
      "updateTime": "2025-08-13T10:30:00Z"
    }
  ],
  "nextPageToken": "aecg2S1w"
}
Response parameters
Parameter Type Description
dataExports array Array of data export job objects that match the specified filters:
  • name: Unique data export job ID.
  • startTime: Starting time range.
  • endTime: Ending time range.
  • gcsBucket: The path to your Google Cloud Storage destination bucket, specified in the following format:
    /projects/{project-id}/buckets/{bucket-name}.
  • includeLogTypes: A comma-separated list of included log types.
  • dataExportStatus.stage: Current status of the data export job.
  • dataExportStatus.exportedGlobPatterns: File path of the exported text file, containing a list of all the exported file shards created in the destination bucket.
  • estimatedVolume: The estimated export volume in bytes.
  • exportedVolume: For completed jobs, the actual volume of data exported.
  • createTime: Job creation time.
  • updateTime: Job update time.
nextPageToken string A token (string) used to retrieve the subsequent page in a different request.
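
The following Python sketch lists jobs with a status filter and follows nextPageToken across pages. It uses Application Default Credentials; the parent, filter string, and endpoint are taken from the samples above and should be treated as placeholders.

# Sketch: list export jobs with a status filter and follow nextPageToken.
# Assumes google-auth with Application Default Credentials; the parent and
# filter syntax mirror the sample request above and are placeholders.
import google.auth
from google.auth.transport.requests import AuthorizedSession

PARENT = "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
session = AuthorizedSession(credentials)

params = {
    "pageSize": 100,
    "filter": 'dataExportStatus.stage:("FINISHED_SUCCESS" OR "CANCELLED")',
}
while True:
    response = session.get(
        f"https://chronicle.googleapis.com/v1alpha/{PARENT}/dataExports", params=params)
    response.raise_for_status()
    page = response.json()
    for job in page.get("dataExports", []):
        print(job["name"], job["dataExportStatus"]["stage"])
    token = page.get("nextPageToken")
    if not token:
        break
    params["pageToken"] = token  # fetch the next page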

UpdateDataExport

Use this endpoint to modify the parameters of an existing data export job. You can only update a job while it is in the IN_QUEUE status.

Request

Endpoint: PATCH https://chronicle.{region}.rep.googleapis.com/v1alpha/{parent}/dataExports/{dataExportId}

Path parameters
Field Type Required Description
parent string required The parent resource where this data export specification is located, specified in the following format:
projects/{project}/locations/{region}/instances/{instance}
where:
{project}: Identifier of your project.
{region}: Region where your destination bucket is located. See the list of regions.
{instance}: Identifier of the source Google SecOps instance.
dataExportId string required The job ID to be updated.
Request body

Send a PATCH request specifying the job's name, and include an update mask (fieldMask) to indicate which fields you are changing.

Sample request - update_mask
{
  "dataExport": {
    "name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
    "endTime": "2025-08-03T00:00:00Z",
    "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket2"
  }
}
Body parameters
Field Type Required Description
dataExportId string required The data export ID to update.
name string required Unique name of the data export job to update, specified in the following format:
projects/{project}/locations/{region}/instances/{instance}/dataexports/{dataExportId}
where:
{project}: Identifier of your project.
{region}: Region where your destination bucket is located. See the list of regions.
{instance}: Identifier of the source Google SecOps instance.
{dataExportId}: UUID identifier of the data export job.
dataExport object required The specification of the collection of log event data to export.
startTime google.protobuf.Timestamp optional The updated starting value of the time range for the export.
endTime google.protobuf.Timestamp optional The updated ending value of the time range for the export.
gcsBucket string optional The updated path to your Google Cloud Storage destination bucket, specified in the following format: /projects/{project-id}/buckets/{bucket-name}.
Note: The bucket must be created in the same region as your Google SecOps tenant.
includeLogTypes array optional The updated, comma-separated list of one or more log types you want to export. If this field is included but the value is left blank, all log types are exported by default.
fieldMask string required A list of keys to be updated. To update all fields sent in the request object, set this value to `*`.
Sample response

Upon a successful request, the API returns the updated field values for the specified job, along with an updated estimate of the volume of data to be exported.

{
  "name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
  "startTime": "2025-08-01T00:00:00Z",
  "endTime": "2025-08-03T00:00:00Z",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket2",
  "includeLogTypes": [
    "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS"
  ],
  "dataExportStatus": {
    "stage": "IN_QUEUE"
  },
  "estimatedVolume": "15737418240",
  "createTime": "2025-08-13T12:00:00Z",
  "updateTime": "2025-08-13T12:05:00Z"
}
Response parameters
Parameter Type Description
name string The unique name of the updated data export job.
startTime string The updated starting time range.
endTime string The updated ending time range.
gcsBucket string The updated path to your Google Cloud Storage destination bucket, specified in the following format: /projects/{project-id}/buckets/{bucket-name}.
includeLogTypes list The updated comma-separated list of included log types.
dataExportStatus.stage string The status of the export job at the time of update (always IN_QUEUE).
estimatedVolume string The updated estimated export volume in bytes.
createTime string The original job creation time.
updateTime string The job update time.
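
For illustration, the following Python sketch updates a queued job's endTime and destination bucket using Application Default Credentials. Passing the update mask as an updateMask query parameter follows the usual REST convention and is an assumption here; the job name and bucket are placeholders.

# Sketch: update a queued job's endTime and destination bucket.
# Assumes google-auth with Application Default Credentials. Passing the update
# mask as an updateMask query parameter is an assumption (the usual REST
# convention); the job name, endpoint, and bucket are placeholders.
import google.auth
from google.auth.transport.requests import AuthorizedSession

JOB_NAME = ("projects/myproject/locations/us/instances/"
            "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/"
            "b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d")
ENDPOINT = "https://chronicle.us.rep.googleapis.com/v1alpha"

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
session = AuthorizedSession(credentials)

body = {
    "endTime": "2025-08-03T00:00:00Z",
    "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket2",
}
response = session.patch(
    f"{ENDPOINT}/{JOB_NAME}",
    params={"updateMask": "endTime,gcsBucket"},
    json=body,
)
response.raise_for_status()
print(response.json()["estimatedVolume"])  # updated volume estimate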

CancelDataExport

Use this endpoint to cancel a queued data export job. You can only cancel a job while it is in the IN_QUEUE status.

For details, see Method: dataExports.cancel.

Request

Endpoint: POST https://chronicle.{region}.rep.googleapis.com/v1alpha/{name}:cancel

Path parameters
Field Type Required Description
name string required The name of the data export job to cancel, specified in the following format:
projects/{project}/locations/{region}/instances/{instance}/dataexports/{id}
where:
{project}: Identifier of your project.
{region}: Region where your destination bucket is located. See the list of regions.
{instance}: Identifier of the source Google SecOps instance.
{id}: UUID identifier of the data export request.
Sample request
{
  "name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d"
}
Request body

The request body must be empty.

Sample response

A successful response will show the job's status as CANCELLED.

{
  "name": "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d",
  "startTime": "2025-08-01T00:00:00Z",
  "endTime": "2025-08-02T00:00:00Z",
  "gcsBucket": "projects/chronicle-test/buckets/dataexport-test-bucket",
  "includeLogTypes": [
    "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_DNS",
    "projects/myproject/locations/us/instances/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/logTypes/GCP_FIREWALL"
  ],
  "dataExportStatus": {
    "stage": "CANCELLED"
  },
  "estimatedVolume": "10737418240",
  "createTime": "2025-08-13T11:00:00Z",
  "updateTime": "2025-08-13T11:10:00Z"
}
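
For illustration, the following Python sketch cancels a queued job by calling the :cancel method with an empty body, using Application Default Credentials; the job name and regional endpoint are placeholders from the samples above.

# Sketch: cancel a queued job with the :cancel custom method.
# Assumes google-auth with Application Default Credentials; the job name and
# regional endpoint are placeholders from the samples above.
import google.auth
from google.auth.transport.requests import AuthorizedSession

JOB_NAME = ("projects/myproject/locations/us/instances/"
            "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/dataExports/"
            "b4a3c2d1-e8f7-6a5b-4c3d-2e1f0a9b8c7d")
ENDPOINT = "https://chronicle.us.rep.googleapis.com/v1alpha"

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
session = AuthorizedSession(credentials)

# The request body must be empty, so post with no JSON payload.
response = session.post(f"{ENDPOINT}/{JOB_NAME}:cancel")
response.raise_for_status()
print(response.json()["dataExportStatus"]["stage"])  # expected: CANCELLED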

Troubleshooting common issues

The API provides detailed error messages to help diagnose problems.

Canonical Code Error Message
INVALID_ARGUMENT INVALID_REQUEST: Invalid request parameter <Parameter1, Parameter2,..>. Please fix the request parameters and try again.
NOT_FOUND BUCKET_NOT_FOUND: The destination Google Cloud Storage bucket <bucketName> does not exist. Please create the destination Google Cloud Storage bucket and try again.
NOT_FOUND REQUEST_NOT_FOUND: The dataExportId:<dataExportId> does not exist. Please add a valid dataExportId and try again.
FAILED_PRECONDITION BUCKET_INVALID_REGION: The Google Cloud Storage bucket <bucketId>'s region:<region1> is not the same region as the SecOps tenant region:<region2>. Please create the Google Cloud Storage bucket in the same region as SecOps tenant and try again.
FAILED_PRECONDITION INSUFFICIENT_PERMISSIONS: The Service Account <P4SA> does not have storage.objects.create, storage.objects.get and storage.buckets.get permissions on the destination Google Cloud Storage bucket <bucketName>. Please provide the required access to the Service Account and try again.
FAILED_PRECONDITION INVALID_UPDATE: The request status is in the <status> stage and cannot be updated. The request can only be updated if the status is in the IN_QUEUE stage.
FAILED_PRECONDITION INVALID_CANCELLATION: The request status is in the <status> stage and cannot be cancelled. The request can only be cancelled if the status is in the IN_QUEUE stage.
RESOURCE_EXHAUSTED CONCURRENT_REQUEST_LIMIT_EXCEEDED: Maximum concurrent requests limit <limit> reached for the request size <sizelimit>. Please wait for the existing requests to complete and try again.
RESOURCE_EXHAUSTED REQUEST_SIZE_LIMIT_EXCEEDED: The estimated export volume: <estimatedVolume> for the request is greater than maximum allowed export volume: <allowedVolume> per request. Please try again with a request within the allowed export volume limit.
INTERNAL INTERNAL_ERROR: An Internal error occurred. Please try again.
