The Cloud Logging API lets you programmatically read and write log entries, set up exclusions, create log-based metrics, and manage sinks to route logs. See the following reference documentation for the API:
- For the REST version of the API, see REST reference.
- For the gRPC version of the API, see gRPC reference.
Ways to interact with the API
Users typically invoke the API by using a command-line interface or a client library written to support a high-level programming language. See the following reference documentation:
- For the command-line interface to the Logging API, see the gcloud logging command-line interface.
- For instructions on setting up client libraries and authorizing the Logging API, with sample code, see Client libraries.
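Setup and authorization details are covered in the client-library documentation linked above. Purely to illustrate the request shape the API expects, the following sketch builds the JSON body for the REST entries.write method using only the standard library; the project ID, log name, and payload values are placeholders, not values from this page.

```python
import json

# Placeholder values; substitute your own project and log name.
PROJECT_ID = "my-project"
LOG_NAME = f"projects/{PROJECT_ID}/logs/my-app-log"

# Body for POST https://logging.googleapis.com/v2/entries:write
body = {
    "entries": [
        {
            "logName": LOG_NAME,
            # "global" is the simplest monitored-resource type;
            # real workloads usually use a more specific type.
            "resource": {"type": "global"},
            "severity": "INFO",
            "jsonPayload": {"event": "signup", "user_id": "u-123"},
        }
    ]
}

request_json = json.dumps(body, indent=2)
print(request_json)
```

An authenticated POST of this body to the entries:write endpoint (for example, through a client library, which handles credentials and retries for you) would write the entry.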
To try out the API without writing any code, you can make calls from the REST reference pages in your browser by using the APIs Explorer. The APIs Explorer is a widget attached to the REST reference page for each method; it appears as a panel titled Try this API. Using the APIs Explorer describes using the widget with the Cloud Monitoring API; the widget works the same way for the Logging API.
Tips for using the API
Following are some tips for using the Logging API.
Bulk processing of log entries
To retrieve log entries, use the entries.list method. However, this method isn't intended for high-volume retrieval of log entries; using it that way might quickly exhaust your quota for read requests.
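For modest, one-off reads, entries.list is still appropriate. The sketch below shows the request shape and pageToken pagination; the HTTP call itself is abstracted behind a `fetch` callable so the paging logic stays visible. The field names match the REST entries.list method, while the project ID and filter are placeholders.

```python
def list_all_entries(fetch, project_id, log_filter, page_size=100):
    """Page through entries.list results.

    `fetch` stands in for an authenticated POST to
    https://logging.googleapis.com/v2/entries:list; it takes the
    request body (a dict) and returns the decoded JSON response (a dict).
    """
    body = {
        "resourceNames": [f"projects/{project_id}"],
        "filter": log_filter,
        "orderBy": "timestamp desc",
        "pageSize": page_size,
    }
    entries = []
    while True:
        response = fetch(body)
        entries.extend(response.get("entries", []))
        token = response.get("nextPageToken")
        if not token:
            return entries
        # Resend the same request with the continuation token.
        body["pageToken"] = token
```

In a real program, `fetch` would attach OAuth credentials and issue the HTTP request; a client library does both for you.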
If you need to perform bulk processing on log entries, the recommended approach is to create a Pub/Sub sink that routes the log entries you want to process to a Pub/Sub topic, and then consume the log entries from there.
This approach has the following advantages:
- It doesn't exhaust your read-request quota. For more on quotas, see Logging usage limits.
- It captures log entries that might have been written out of order, without workarounds to seek back and re-read recent entries to ensure nothing was missed.
- It automatically buffers the log entries if the logs consumer becomes unavailable.
- It doesn't require you to send the logs to Cloud Logging storage, so they don't count against your ingestion allowance. For more information, see Pricing: Logging details.
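When a Pub/Sub topic is the sink destination, each Pub/Sub message carries one LogEntry serialized as JSON in its data field. The following sketch decodes such a message as a subscriber might receive it; the sample entry is fabricated for illustration, and real consumers would use the Pub/Sub client library to receive messages.

```python
import base64
import json

def decode_log_entry(message_data):
    """Decode the LogEntry JSON carried in a Pub/Sub message's data field.

    For push subscriptions the data arrives base64-encoded; pull
    subscribers receive raw bytes and can skip the b64decode step.
    """
    return json.loads(base64.b64decode(message_data))

# Fabricated example of what a routed log entry might look like.
sample_entry = {
    "logName": "projects/my-project/logs/my-app-log",
    "severity": "ERROR",
    "timestamp": "2024-01-01T00:00:00Z",
    "textPayload": "something went wrong",
}
message_data = base64.b64encode(json.dumps(sample_entry).encode("utf-8"))

entry = decode_log_entry(message_data)
print(entry["severity"], entry["textPayload"])
```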
The approach of creating Pub/Sub sinks for log routing can be applied to a variety of analytics platforms. The following documents illustrate this approach with different target platforms: