Batch and route logs retroactively

This page describes how to manually copy log entries that are already stored in Cloud Logging log buckets to Cloud Storage buckets. You might want to copy log entries from log buckets to Cloud Storage buckets for the following reasons:

  • You didn't route log entries to Cloud Storage before they were stored in Logging.
  • To share log entries with auditors outside of Logging.
  • To analyze log entries with scripts in Cloud Storage.

When you copy log entries to Cloud Storage, the log entries also remain in the log bucket they were copied from.

Copy operations don't replace sinks, which automatically send all incoming log entries to a pre-selected supported storage destination, including Cloud Storage buckets. Use sinks when you know that you want to store log entries in a Cloud Storage bucket.

To copy logs and then manage and monitor the operation, you must use the Google Cloud CLI.

Limitations

The following limitations apply when copying log entries:

  • You can copy logs to Cloud Storage buckets only; other destinations aren't available.

  • You can't copy logs from log buckets that have customer-managed encryption keys (CMEK) configured.

Before you begin

Before you get started with copying logs, do the following:

  • To get the permissions that you need to copy log entries from Logging and then write the log entries to Cloud Storage, ask your administrator to grant you IAM roles on your project that provide read access to the log bucket and write access to the destination Cloud Storage bucket.

  • To get the permissions that you need to view and manage the status of a copy operation, ask your administrator to grant you the Logs Configuration Writer (roles/logging.configWriter) IAM role on your project.

Copy log entries

Logging copies only the log entries that are already stored in the log bucket when the copy operation starts. Log entries that arrive in the log bucket after the copy operation starts aren't copied to Cloud Storage.

To copy log entries to Cloud Storage, you need to know the following information:

  • The ID and location of the log bucket you're copying from. To retrieve the log bucket ID and its location, use the gcloud CLI command gcloud logging buckets list.
  • The name of the Cloud Storage bucket you're copying to. For information about how to retrieve the Cloud Storage bucket name, see Getting Cloud Storage bucket information.
  • A filter for the log entries you want to copy.
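For example, a short session to gather this information might look like the following. The project ID is hypothetical:

```shell
# List the log buckets in the project; the output includes each
# bucket's ID and location. (Hypothetical project ID.)
gcloud logging buckets list --project=my-project

# List the Cloud Storage buckets that are available as destinations.
gcloud storage ls --project=my-project
```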

To copy log entries, run the gcloud logging copy command:

gcloud logging copy BUCKET_ID storage.googleapis.com/CLOUD_STORAGE_BUCKET_NAME \
--location=LOCATION --log-filter='FILTER' --project=PROJECT_ID

Before you run the previous command, do the following:

  • Replace BUCKET_ID with the name of your log bucket.
  • Replace CLOUD_STORAGE_BUCKET_NAME with the name of your Cloud Storage bucket.
  • Replace LOCATION with the location of the log bucket.
  • Optional: Replace FILTER with a filter that defines which log entries are copied.

    If you omit the --log-filter flag, then all log entries in the log bucket are copied to the Cloud Storage bucket.

  • Replace PROJECT_ID with your Google Cloud project ID. You can omit this flag when the active gcloud CLI configuration is set to the correct Google Cloud project.

Example command:

gcloud logging copy my-log-bucket storage.googleapis.com/my-gcs-bucket \
--location=global --log-filter='timestamp > "2024-07-18T10:00:00.0Z"' \
--project=my-project

This command creates a long-running operation to run in the background and returns the name of the copy operation and the location of the log bucket:

name: projects/PROJECT_ID/locations/LOCATION/operations/OPERATION_ID

The location of the copy operation is the same as the location of the log bucket from which you're copying logs.
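The commands later in this document take the operation ID, which is the final path segment of the returned operation name. A small shell sketch, assuming the hypothetical operation name below, extracts it:

```shell
# Operation name as returned by `gcloud logging copy` (hypothetical values).
OP_NAME="projects/my-project/locations/global/operations/1234567890"

# The operation ID is the last path segment of the resource name.
OP_ID="${OP_NAME##*/}"
echo "$OP_ID"
```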

View and manage copy operations

You can view and manage your copy operations by using the gcloud logging operations commands, which let you list, view, and cancel operations.

The following commands require that you specify the location of the operation. Use the location of your log bucket. For information about how to find the location of your log bucket, see View a bucket's details.

List copy operations

You can list recent copy operations, including scheduled, running, completed, failed, and cancelled operations. Recent copy operations appear in the results for up to 30 days after the end time.

To list copy operations, run the following command:

gcloud logging operations list --location=LOCATION \
--operation-filter=request_type=CopyLogEntries \
--project=PROJECT_ID

Before you run the previous command, do the following:

  • Replace LOCATION with the location of the log bucket from which you're copying logs.
  • Replace PROJECT_ID with your Google Cloud project ID.

The command returns information about the long-running operation, including the operation ID:

projects/PROJECT_ID/locations/LOCATION/operations/OPERATION_ID

View the status of a copy operation

You can retrieve the status and other metadata about copy operations, including the following:

  • startTime: The timestamp indicating the creation of the operation.
  • endTime: The timestamp indicating the completion of the operation.
  • state: The status of the operation (scheduled, running, cancelled, failed, or succeeded).
  • cancellation: Whether the user has requested to cancel the operation.
  • progress: Estimated progress of the operation (0-100%).
  • destination: The name of the Cloud Storage bucket to which the operation is copying logs.
  • filter: The filter specifying which log entries to copy.
  • name: The name of the log bucket from which the operation is copying logs.
  • logEntriesCopiedCount: The number of log entries successfully copied to the Cloud Storage bucket by the operation.

Not all of the listed metadata fields apply to every copy operation. For example, if a copy operation is still running, then the endTime metadata doesn't apply to it. Similarly, if the --log-filter flag wasn't set when the gcloud logging copy command ran, then the filter metadata doesn't apply to the operation.

To retrieve information about a copy operation, run the following command:

gcloud logging operations describe OPERATION_ID \
--location=LOCATION --project=PROJECT_ID

Before you run the previous command, do the following:

  • Replace OPERATION_ID with the ID of the operation.
  • Replace LOCATION with the location of the log bucket from which you're copying logs.
  • Replace PROJECT_ID with your Google Cloud project ID.

The command returns metadata about the copy operation. For example, here is an output for an operation that is in progress:

done: false
metadata:
  '@type': type.googleapis.com/google.logging.v2.CopyLogEntriesMetadata
  progress: 75
  destination: storage.googleapis.com/my-storage-bucket-1
  source: projects/my-test-project/locations/us-central1/buckets/my-logging-bucket-2
  verb: copy
  startTime: '2024-05-23T10:52:40.039751Z'
  state: OPERATION_STATE_RUNNING
name: projects/my-test-project/locations/us-central1/buckets/my-logging-bucket-2
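To wait for completion from a script, you can poll the done field by using the gcloud CLI's --format flag. The following is a sketch only; the operation ID, location, project ID, and polling interval are hypothetical:

```shell
# Poll a copy operation until it reports done (hypothetical IDs).
OP_ID=1234567890
until [ "$(gcloud logging operations describe "$OP_ID" \
      --location=global --project=my-project \
      --format='value(done)')" = "True" ]; do
  sleep 60  # check once a minute
done
echo "Copy operation $OP_ID finished."
```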

Cancel a copy operation

You can cancel an in-progress copy operation. If you cancel a copy operation, all log entries that were copied before the operation was cancelled remain in the Cloud Storage bucket.

After you cancel a copy operation, Logging finishes any in-flight work before it completes the cancellation. As a result, some log entries might still be copied to Cloud Storage after you cancel the operation.

To cancel a copy operation, run the following command:

gcloud logging operations cancel OPERATION_ID \
--location=LOCATION --project=PROJECT_ID

Before you run the previous command, do the following:

  • Replace OPERATION_ID with the ID of the operation.
  • Replace LOCATION with the location of the log bucket from which you're copying logs.
  • Replace PROJECT_ID with your Google Cloud project ID.
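For example, to cancel the hypothetical operation from the earlier examples and then confirm its state:

```shell
# Request cancellation of the copy operation (hypothetical IDs).
gcloud logging operations cancel 1234567890 \
    --location=global --project=my-project

# Confirm that the operation is being cancelled.
gcloud logging operations describe 1234567890 \
    --location=global --project=my-project \
    --format='value(metadata.state)'
```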

View logs in Cloud Storage

To view and understand the logs that you copied to Cloud Storage, see View logs routed to Cloud Storage.
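For a quick look at what a copy operation wrote, you can list the objects in the destination bucket. The bucket name below is hypothetical:

```shell
# Recursively list the objects that were copied into the bucket.
gcloud storage ls --recursive gs://my-gcs-bucket
```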

Quotas and limits

All copy operations take at least an hour to complete, regardless of the amount of data being copied.

To copy a large volume of data (for example, petabytes), split the work across multiple copy operations by using the timestamp field in the --log-filter flag.

The copy command can't copy log entries whose retention period has expired.
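A sketch of the timestamp-based split, using hypothetical bucket names, project ID, and a fixed list of one-day windows:

```shell
# Issue one copy operation per day of logs (hypothetical names and dates).
for window in 2024-07-18,2024-07-19 2024-07-19,2024-07-20; do
  start="${window%,*}"   # window start date (inclusive)
  end="${window#*,}"     # window end date (exclusive)
  gcloud logging copy my-log-bucket storage.googleapis.com/my-gcs-bucket \
      --location=global --project=my-project \
      --log-filter="timestamp >= \"${start}T00:00:00Z\" AND timestamp < \"${end}T00:00:00Z\""
done
```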

Pricing

Cloud Logging doesn't charge to route logs to a supported destination; however, the destination might apply charges. With the exception of the _Required log bucket, Cloud Logging charges to stream logs into log buckets and for storage longer than the default retention period of the log bucket.

Cloud Logging doesn't charge for copying logs, for defining log scopes, or for queries issued through the Logs Explorer or Log Analytics pages.

For more information, see the Cloud Logging pricing documentation.