This page describes how to manually copy log entries that are already stored in Cloud Logging buckets to Cloud Storage buckets.
You might want to copy log entries from Logging buckets to Cloud Storage buckets for the following reasons:
- You forgot to route log entries to Cloud Storage before they were stored in Logging.
- To share log entries with auditors outside of Logging.
- To analyze log entries with scripts in Cloud Storage.
When you copy log entries to Cloud Storage, the log entries also remain in the log bucket they were copied from.
Note that copy operations don't replace sinks, which automatically send all incoming log entries to a pre-selected supported storage destination, including Cloud Storage. Use sinks when you know in advance where you want to store log entries.
To copy logs and then manage and monitor the operation, you use the `gcloud` tool as described in the following sections.
The following limitations apply when copying log entries:
- This feature isn't available in the Cloud Console or the Logging API; use the `gcloud` tool as described.
- You can copy logs to Cloud Storage buckets only; other destinations aren't available.
Before you begin
Before you get started with copying logs, do the following:
Verify that you have the correct Identity and Access Management permissions:
- To copy log entries from Logging and then write the log entries to Cloud Storage, you must have both of the following:
  - For the Logging bucket from which you're copying the logs, the `roles/logging.admin` role or a custom role with equivalent permissions.
  - For the Cloud Storage bucket to which you're sending the logs, the `roles/storage.objectCreator` role or a custom role with equivalent permissions.
- To view and manage the status of a copy operation, you must have an appropriate Logging role or a custom role with equivalent permissions.
Copy log entries
Logging only copies log entries that are stored in the log bucket when the copy operation starts. Log entries that are ingested and stored after the copy operation starts don't get copied to Cloud Storage.
To copy log entries to Cloud Storage, you need to know the following information:
- The ID of the Logging bucket you're copying from. Use the `gcloud logging buckets list` command to retrieve this information.
- The specific location of the Logging bucket you're copying from. See Get Logging bucket location on this page to retrieve this information.
- The ID of the Cloud Storage bucket you're copying to. See Getting Cloud Storage bucket information to retrieve this information.
- A filter for the log entries you want to copy.
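As a sketch of the information-gathering step, the source and destination bucket IDs might be listed like this (the project name is hypothetical):

```shell
# List the Logging buckets in the project; the BUCKET_ID column shows
# the ID of the log bucket to copy from.
gcloud logging buckets list --project=my-project

# List the Cloud Storage buckets in the project to confirm the name of
# the destination bucket.
gsutil ls -p my-project
```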
To copy log entries, run the `gcloud alpha logging copy` command:

```
gcloud alpha logging copy LOGGING_BUCKET_ID storage.googleapis.com/CLOUD_STORAGE_BUCKET_NAME \
  --location=LOCATION [--log-filter=FILTER] --project=PROJECT_ID
```

For example:

```
gcloud alpha logging copy my-log-bucket storage.googleapis.com/my-gcs-bucket \
  --location=global --log-filter='timestamp > "2021-03-18T10:00:00.0Z"' \
  --project=my-project
```
This command creates a long-running operation to run in the background and returns the name of the copy operation and the location of the log bucket:
The location of the copy operation is the same as the location of the log bucket from which you're copying logs. For Logging buckets that were created with the `global` location, the actual location of the bucket is returned when you run the copy operation.
Get Logging bucket location
To run the `gcloud alpha logging operations` commands, you need the specific location in which Logging created the bucket. The log bucket's regional location is returned after you run the `gcloud alpha logging copy` command.

If you don't know the specific location of the Logging bucket, possibly because it was created using the `global` location, complete the following steps:
1. Find the geographic locations that your Cloud project uses:

   ```
   gcloud alpha logging locations list
   ```
2. List the operations by location until you find your copy operation. When you run `gcloud alpha logging operations list`, you must specify the type of operation by using the `--operation-filter` flag; possible values for REQUEST_TYPE include `CopyLogEntries`:

   ```
   gcloud alpha logging operations list --location=LOCATION \
     --operation-filter=request_type=REQUEST_TYPE \
     --project=PROJECT_ID
   ```
The command returns information about the long-running operation, including the operation ID.
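If several locations are in play, the search can be scripted. This is a sketch that assumes the `locationId` field name in the `--format` projection; the project name is hypothetical:

```shell
# Check every location the project uses for copy operations.
for LOCATION in $(gcloud alpha logging locations list --format="value(locationId)"); do
  echo "Copy operations in ${LOCATION}:"
  gcloud alpha logging operations list --location="${LOCATION}" \
    --operation-filter=request_type=CopyLogEntries \
    --project=my-project
done
```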
View and manage copy operations
You can view and manage your copy operations by using the `gcloud alpha logging operations` commands, which let you do the following:
- List all current copy operations
- View the status of an operation
- Cancel an in-progress copy operation
List copy operations
You can list recent copy operations, including scheduled, running, completed, failed, and cancelled operations. Recent copy operations appear in the results for up to 5 days after the end time.
To list copy operations, run the following command:

```
gcloud alpha logging operations list --location=LOCATION \
  --operation-filter=request_type=CopyLogEntries \
  --project=PROJECT_ID
```
View the status of a copy operation
You can retrieve the status and other metadata about copy operations, including the following:
- `startTime`: The timestamp indicating the creation of the operation.
- `endTime`: The timestamp indicating the completion of the operation.
- `state`: The status of the operation (scheduled, running, cancelled, failed, or succeeded).
- `cancellation`: Whether the user has requested to cancel the operation.
- `progress`: Estimated progress of the operation (0-100%).
- `destination`: The name of the Cloud Storage bucket to which the operation is copying logs.
- `filter`: The filter specifying which log entries to copy.
- `name`: The name of the Logging bucket from which the operation is copying logs.
- `logEntriesCopiedCount`: The number of log entries successfully copied to the Cloud Storage bucket by the operation.
Not all of the listed metadata fields apply to every copy operation. For example, if a copy operation is still running, the `endTime` field wouldn't apply to the operation. As another example, if the `--log-filter=FILTER` flag wasn't used when running the `gcloud alpha logging copy` command, then the `filter` metadata wouldn't apply to the operation.
To retrieve information about a copy operation, run the following command:

```
gcloud alpha logging operations describe OPERATION_ID \
  --location=LOCATION --project=PROJECT_ID
```
The command returns metadata about the copy operation. For example, here is the output for an operation that is in progress:

```
done: false
metadata:
  '@type': type.googleapis.com/google.logging.v2.CopyLogEntriesMetadata
  progress: 75
  request:
    destination: storage.googleapis.com/my-storage-bucket-1
    filter: 'timestamp > "2021-05-23T10:00:00.0Z"'
    name: projects/my-test-project/locations/us-central1/buckets/my-logging-bucket-2
  startTime:
```
Cancel a copy operation
You can cancel an in-progress copy operation. If you cancel a copy operation, all log entries that were copied before the operation was cancelled remain in the Cloud Storage bucket.
After you cancel a copy operation, Logging finishes any in-flight work before completing the cancellation. This might result in some log entries still being copied to Cloud Storage after you cancel the operation.
To cancel a copy operation, run the following command:

```
gcloud alpha logging operations cancel OPERATION_ID \
  --location=LOCATION --project=PROJECT_ID
```
View logs in Cloud Storage
To view and understand the logs that you copied to Cloud Storage, see Logs organization.
Quotas and limits
For information on quotas, see Cloud Logging API quotas.
To copy a large volume of logs, for example petabytes, split the copying across multiple copy operations by using the `timestamp` field in the filter.
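For example, a large copy might be split into two operations over adjacent time windows. The bucket and project names below are hypothetical, and this is a sketch rather than a tested sequence:

```shell
# First operation: copy log entries with timestamps before June 1, 2021.
gcloud alpha logging copy my-log-bucket storage.googleapis.com/my-gcs-bucket \
  --location=us-central1 \
  --log-filter='timestamp < "2021-06-01T00:00:00.0Z"' \
  --project=my-project

# Second operation: copy entries from June 1, 2021 onward. The two
# filters are disjoint, so each entry is covered exactly once.
gcloud alpha logging copy my-log-bucket storage.googleapis.com/my-gcs-bucket \
  --location=us-central1 \
  --log-filter='timestamp >= "2021-06-01T00:00:00.0Z"' \
  --project=my-project
```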
Cloud Logging doesn't charge you for copying logs, but Cloud Storage destination charges might apply. For details, see Cloud Storage pricing.