Write and query log entries using a Python script

This quickstart introduces you to some of the capabilities of Cloud Logging and shows you how to do the following:

  • Write log entries using a Python script.
  • View log entries using a Python script.
  • Delete log entries using a Python script.
  • Route logs to a Cloud Storage bucket.

Logging can route log entries to the following destinations:

  • Cloud Storage buckets
  • BigQuery datasets
  • Pub/Sub topics
  • Logging buckets
  • Google Cloud projects

Before you begin

You must have a Google Cloud project with billing enabled to complete this quickstart. If you don't have a Google Cloud project, or if you don't have billing enabled for your Google Cloud project, do the following:
  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Google Cloud project.

This quickstart uses Cloud Logging and Cloud Storage. Use of these resources can incur a cost to you. When you finish this quickstart, you can avoid continued billing by deleting the resources that you created. See Clean up on this page for more details.

Getting started

You can use the Cloud Shell environment or a generic Linux environment to complete this quickstart. Python is preinstalled in the Cloud Shell.

Cloud Shell

  1. Open the Cloud Shell and verify your Google Cloud project configuration:

    1. From the Google Cloud console, click Activate Cloud Shell.

      The Cloud Shell opens in a window and displays a welcome message.

    2. The welcome message echoes the configured Google Cloud project ID. If this isn't the Google Cloud project that you want to use, run the following command after replacing PROJECT_ID with your project's ID:

       gcloud config set project PROJECT_ID
      

Linux

  1. Ensure that Python is installed and configured. For information about preparing your machine for Python development, see Setting up a Python development environment.

  2. Install the Cloud Logging client library:

    pip install --upgrade google-cloud-logging
    
  3. Set up the Identity and Access Management permissions for your Google Cloud project. In the following steps, you create a service account for your Google Cloud project, and then you generate and download a service account key file to your Linux workstation.

    1. In the navigation panel of the Google Cloud console, select IAM & Admin, and then select Service Accounts:

      Go to Service Accounts

    2. Select your quickstart Google Cloud project, and then click Create Service Account:

      • Enter an account name.
      • Enter an account description.
      • Click Create and continue.
    3. Click the Select a role field and select Logging Admin.

    4. Click Done to finish creating the service account.

    5. Create a key file and download it to your workstation:

      • For your service account, click More options, and select Manage keys.
      • In the Keys pane, click Add key.
      • Click Create new key.
      • For the Key type, select JSON and then click Create. After a moment, a pop-up window displays a message similar to the one shown below:

        Private key saved to your computer.

  4. On your Linux workstation, provide your authentication credentials to your application by setting the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path to your key file. For example:

     export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/FILE_NAME.json"
    

    This environment variable only applies to your current shell session, so if you open a new session, set the variable again.
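
    To confirm that the client library can find your credentials, you can run a quick check before continuing. This snippet isn't part of the quickstart scripts; it's a minimal sketch that creates a Logging client, which triggers authentication, and prints the project it authenticated against:

     from google.cloud import logging

     # Creating the client reads GOOGLE_APPLICATION_CREDENTIALS and
     # raises an error if the credentials are missing or invalid.
     client = logging.Client()
     print(f"Authenticated against project: {client.project}")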

Clone source

To configure your Cloud Shell for this quickstart, do the following:

  1. Clone the GitHub project python-logging:

      git clone https://github.com/googleapis/python-logging
    

    The directory samples/snippets contains the two scripts used in this quickstart:

    • snippets.py lets you manage entries in a log.
    • export.py lets you manage log exports.
  2. Change to the snippets directory:

      cd python-logging/samples/snippets
    

Write log entries

The snippets.py script uses the Python client libraries to write log entries to Logging. When the write option is specified on the command line, the script writes the following log entries:

  • An entry with unstructured data and no specified severity level.
  • An entry with unstructured data and a severity level of ERROR.
  • An entry with JSON structured data and no specified severity level.

To write new log entries to the log my-log, run the snippets.py script with the write option:

python snippets.py my-log write
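
If you're curious what the write option does, its core can be sketched with a few client-library calls. The following is an illustrative snippet, not the exact snippets.py code; it writes the same three kinds of entries:

    from google.cloud import logging

    client = logging.Client()
    logger = client.logger("my-log")

    # Unstructured entry with no explicit severity.
    logger.log_text("Hello, world!")

    # Unstructured entry with severity ERROR.
    logger.log_text("Goodbye, world!", severity="ERROR")

    # Structured (JSON) entry with no explicit severity.
    logger.log_struct({
        "name": "King Arthur",
        "quest": "Find the Holy Grail",
        "favorite_color": "Blue",
    })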

View log entries

To view the log entries in the Cloud Shell, run the snippets.py script with the list option:

python snippets.py my-log list

The command completes with the result:

    Listing entries for logger my-log:
    * 2018-11-15T16:05:35.548471+00:00: Hello, world!
    * 2018-11-15T16:05:35.647190+00:00: Goodbye, world!
    * 2018-11-15T16:05:35.726315+00:00: {u'favorite_color': u'Blue', u'quest': u'Find the Holy Grail', u'name': u'King Arthur'}

If the result doesn't show any entries, then retry the command. It takes a few moments for Logging to receive and process log entries.

You can also view your log entries by using the Logs Explorer. For more details, see View logs by using the Logs Explorer.
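
As with writing, the listing logic reduces to a few client-library calls. A rough sketch of what the list option does, not the exact snippets.py code:

    from google.cloud import logging

    client = logging.Client()
    logger = client.logger("my-log")

    print("Listing entries for logger my-log:")
    for entry in logger.list_entries():
        # entry.payload is a string for text entries and a dict for
        # structured (JSON) entries.
        print(f"* {entry.timestamp.isoformat()}: {entry.payload}")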

Delete log entries

To delete all of the log entries in the log my-log, run the snippets.py script with the option delete:

python snippets.py my-log delete

The command completes with the result:

Deleted all logging entries for my-log.
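
The delete option maps to a single client-library call that removes every entry in the log. A minimal sketch:

    from google.cloud import logging

    client = logging.Client()
    logger = client.logger("my-log")

    # Deletes all entries in my-log. The log itself is recreated
    # implicitly the next time you write an entry to it.
    logger.delete()
    print("Deleted all logging entries for my-log.")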

Route logs

In this section, you do the following:

  • Create a Cloud Storage bucket as the destination for your data.
  • Create a sink that copies new log entries to the destination.
  • Update the permissions of your Cloud Storage bucket.
  • Write log entries to Logging.
  • Optionally, verify the content of your Cloud Storage bucket.

Create destination

The export destination for this quickstart is a Cloud Storage bucket. To create a Cloud Storage bucket, do the following:

  1. In the navigation panel of the Google Cloud console, select Cloud Storage, and then click Buckets:

    Go to Buckets

  2. Click Create bucket.
  3. Enter a name for your bucket.
  4. For Location type, select Region, and then choose the region nearest to you to minimize latency.
  5. For Default storage class, select Standard.
  6. For Access control, select Fine-grained.
  7. For Protection tools, select None, and then click Create.

This quickstart uses a Cloud Storage bucket name of myloggingproject-1.
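
If you prefer to create the bucket programmatically, a rough equivalent uses the google-cloud-storage client library (a separate package, installed with pip install google-cloud-storage). The bucket name and region below are only examples:

    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("myloggingproject-1")
    bucket.storage_class = "STANDARD"

    # "us-central1" is an example; pick the region nearest to you.
    client.create_bucket(bucket, location="us-central1")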

Create sink

A sink is a rule that determines whether Logging routes a newly arrived log entry to a destination. A sink has three attributes:

  • Name
  • Destination
  • Filter

For more information about sinks, see Sinks.

If a newly arrived log entry matches the sink's filter, then Logging routes that log entry to the destination.

The export.py script uses the Python client libraries to create, list, modify, and delete sinks. To create the sink mysink, which exports all log entries with a severity of at least INFO to the Cloud Storage bucket myloggingproject-1, run the following command:

python export.py create mysink myloggingproject-1 "severity>=INFO"

To view your sinks, run the export.py script with the list option:

python export.py list

The script returns the following:

    mysink: severity>=INFO -> storage.googleapis.com/myloggingproject-1
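
The create and list operations above wrap a small amount of client-library code. A simplified sketch, not the exact export.py source:

    from google.cloud import logging

    client = logging.Client()

    # Create a sink that routes entries with severity INFO or higher
    # to the Cloud Storage bucket.
    sink = client.sink(
        "mysink",
        filter_="severity>=INFO",
        destination="storage.googleapis.com/myloggingproject-1",
    )
    sink.create()

    # List every sink in the project.
    for s in client.list_sinks():
        print(f"{s.name}: {s.filter_} -> {s.destination}")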

Update destination permissions

The permissions of the destination, in this case, your Cloud Storage bucket, aren't modified when you create a sink by using the export.py script. You must change the permission settings of your Cloud Storage bucket to grant write permission to your sink. For information about service accounts, access scopes, and Identity and Access Management roles, see Service Accounts.

To update the permissions on your Cloud Storage bucket:

  1. Identify your sink's Writer Identity:

    1. In the navigation panel of the Google Cloud console, select Logging, and then select Log Router:

      Go to Log Router

      You see a summary table of your sinks.

    2. Find your sink in the table, select Menu, and then select View sink details.

    3. Copy the writer identity to your clipboard.

  2. In the navigation panel of the Google Cloud console, select Cloud Storage, and then click Buckets:

    Go to Buckets

  3. To open the detailed view, click the name of your bucket.

  4. Select Permissions and click Grant Access.

  5. Paste the writer identity into the New principals box. Remove the serviceAccount: prefix from the writer identity address.

  6. Set the Role to Storage Object Creator, then click Save.

For more information, see Set destination permissions.
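
If you'd rather grant the permission from code than from the console, a rough equivalent with the google-cloud-storage client library is sketched below. Note that the API expects the full writer identity, including the serviceAccount: prefix that the console step strips; the identity shown here is a placeholder:

    from google.cloud import storage

    # Placeholder: paste the writer identity copied from the Log Router,
    # keeping its serviceAccount: prefix.
    WRITER_IDENTITY = "serviceAccount:WRITER_IDENTITY_EMAIL"

    client = storage.Client()
    bucket = client.bucket("myloggingproject-1")

    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({
        "role": "roles/storage.objectCreator",
        "members": {WRITER_IDENTITY},
    })
    bucket.set_iam_policy(policy)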

Validate sink

To validate that your sink and destination are properly configured, do the following:

  1. Write new log entries to the log my-log:

    python snippets.py my-log write
    
  2. View your Cloud Storage bucket's contents:

    1. In the navigation panel of the Google Cloud console, select Cloud Storage, and then click Buckets:

      Go to Buckets

    2. To open the detailed view, click the name of your bucket. The detailed view lists the folders that contain data. If there isn't data in your bucket, the following message is displayed:

      There are no live objects in this bucket.

      As described in Late-arriving log entries, it might take 2 or 3 hours before the first entries appear at the destination, or before you are notified of a configuration error.

      After your bucket has received data, the detailed view displays the corresponding folders.

    3. The data is organized as a hierarchy of folders: the top-level folder is named after the log, and its subfolders are labeled successively with the year, month, and day. To view the data that was exported by your sink, click the folder my-log, and then click through the year, month, and day subfolders until you reach a file whose name ends in json.

    4. The JSON file contains the log entries that were exported to your Cloud Storage bucket. Click the name of the JSON file to see its contents. The contents are similar to:

       {"insertId":"yf1cshfoivz48",
       "logName":"projects/loggingproject-222616/logs/my-log",
       "receiveTimestamp":"2018-11-15T23:06:14.738729911Z",
       "resource":{"labels":{"project_id":"loggingproject-222616"},"type":"global"},
       "severity":"ERROR",
       "textPayload":"Goodbye, world!",
       "timestamp":"2018-11-15T23:06:14.738729911Z"}
      

      Because the severity level ERROR is greater than the severity level INFO, the log entry containing the string "Goodbye, world!" is exported to the sink destination. The other log entries weren't exported because their severity was left at the default value, which is lower than INFO.
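
      You can also check which entries matched the sink's filter by querying Logging directly with the same filter expression. A minimal sketch:

        from google.cloud import logging

        client = logging.Client()
        log_filter = (
            f'logName="projects/{client.project}/logs/my-log"'
            " AND severity>=INFO"
        )
        for entry in client.list_entries(filter_=log_filter):
            print(f"{entry.severity}: {entry.payload}")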

Troubleshooting

There are several reasons why a Cloud Storage bucket might be empty:

  • The bucket hasn't received data. It might take 2 or 3 hours before the first entries appear at the destination, or before you are notified of a configuration error. For more information, see Late-arriving log entries.

  • There is a configuration error. In this case, you receive an email message with a subject line similar to the following:

     [ACTION REQUIRED] Logging export config error in myloggingproject.

    The content of the email body describes the configuration issue. For example, if you don't update your destination permissions, then the email lists the following error code:

     bucket_permission_denied

    To correct this particular condition, see Update destination permissions on this page.

  • No log entries were written after the sink was created. The sink is applied only to newly arriving log entries. To correct this situation, write new log entries:

     python snippets.py my-log write
    

Clean up

To avoid incurring charges to your Google Cloud account for the resources used on this page, follow these steps.

  1. (Optional) Delete the log entries you created. If you don't delete your log entries, they will expire and be removed. See Quotas and limits.

    To delete all log entries in the log my-log, run the following command:

     python snippets.py my-log delete
    
  2. Delete your Google Cloud project or delete your quickstart resources.

    • To delete your Google Cloud project, from the Google Cloud console Project Info pane, click Go to project settings, and then click Shut down.

    • To delete your quickstart resources:

      1. Delete your sink by running the following command:

        python export.py delete mysink
        
      2. Delete your Cloud Storage bucket. Go to the Google Cloud console and click Storage > Buckets. Place a check in the box next to your bucket name and then click Delete.
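
      If you prefer to script both cleanup steps, a rough equivalent is sketched below; the names are the examples used throughout this quickstart, and bucket.delete(force=True) works only when the bucket holds a small number of objects:

        from google.cloud import logging, storage

        # Delete the sink.
        logging.Client().sink("mysink").delete()

        # Delete the bucket together with any objects it still contains.
        storage.Client().bucket("myloggingproject-1").delete(force=True)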

What's next