Quickstart using Python

In this quickstart, you execute Python programs to write, read, delete, and export log entries.

Before you begin

You must have a Google Cloud project with billing enabled to complete this quickstart. If you don't have a Google Cloud project, or if you don't have billing enabled for your project, do the following:
  1. Sign in to your Google Account.

    If you don't already have one, sign up for a new account.

  2. In the Cloud Console, on the project selector page, select or create a Google Cloud project.

    Go to the project selector page

  3. Make sure that billing is enabled for your Google Cloud project. Learn how to confirm billing is enabled for your project.

This quickstart uses Stackdriver Logging and Cloud Storage. Use of these resources can incur a cost to you. When you finish this quickstart, you can avoid continued billing by deleting the resources that you created. See Clean up for more detail.

Getting started

You can use the Cloud Shell environment or a generic Linux environment to complete this quickstart.

Cloud Shell

  1. Python versions 2.7 and 3.5 are pre-installed in Cloud Shell. You don't need to install or configure any other software.

  2. Open the Cloud Shell and verify your project configuration:

    1. From the Cloud Console, click the Activate Cloud Shell button in the upper right-hand corner:

      Activate Cloud Shell

      The Cloud Shell opens in a window and displays a welcome message:

      Welcome to Cloud Shell

    2. The welcome message echoes the configured project ID. If this isn't the project that you want to use, run the following command after replacing [PROJECT_ID] with your project ID:

       gcloud config set project [PROJECT_ID]
      

Linux

  1. Install and configure Python. You can use Python 2 or 3 for this quickstart. See Setting up a Python development environment for details.

  2. Set up the Cloud Identity and Access Management permissions for your project. In the following steps, you create a service account for your project, and then you generate and download a service account key file to your Linux workstation.

    1. In the Cloud Console, go to IAM & admin > Service accounts:

      Go to Service accounts

    2. Select your quickstart project, and then click Create Service Account:

      • Enter an account name.
      • Enter an account description.
      • Click Create.
    3. On the Service account permissions (optional) pane, for the Role, select Logging Admin from the drop-down list. Click Continue.

    4. Skip the option to grant users access to the service account.

    5. Create a key file and download it to your workstation:

      • In the Create key (optional) pane, click Create key.
      • For the key type, select JSON, and then click Create. After a moment, a pop-up window displays a message similar to the one shown below:

        Private key saved

    6. To complete the creation of your service account, click Done.

  3. On your Linux workstation, provide your authentication credentials to your application by setting the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path to your key file. For example:

     export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/[FILE_NAME].json"
    

    This environment variable only applies to your current shell session, so if you open a new session, set the variable again.
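    To confirm that your credentials are set up correctly, you can create a Logging client in Python; the client library reads GOOGLE_APPLICATION_CREDENTIALS automatically. This is a minimal verification sketch, not part of the sample programs, and it assumes the google-cloud-logging package is installed:

     from google.cloud import logging

     # The client reads GOOGLE_APPLICATION_CREDENTIALS and authenticates
     # as the service account in your key file.
     client = logging.Client()
     print('Authenticated for project: {}'.format(client.project))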

Clone source

Clone the GitHub project python-docs-samples:

git clone https://github.com/GoogleCloudPlatform/python-docs-samples

The directory python-docs-samples/logging/cloud-client contains the two programs used in this quickstart:

  • snippets.py lets you manage entries in a log.
  • export.py lets you manage log exports.

To change to the program directory, run the following command:

cd python-docs-samples/logging/cloud-client

Write log entries

The snippets.py program uses the Python client libraries to write log entries to Logging. When the write option is specified on the command line, the program writes the following log entries:

  • An entry with unstructured data and no specified severity level.
  • An entry with unstructured data and a severity level of ERROR.
  • An entry with JSON structured data and no specified severity level.

To write new log entries to the log my-log, run the snippets.py program with the write option:

python snippets.py my-log write
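The write option is built on the Cloud Logging client library. The following sketch illustrates roughly how the three entries above can be written with the client library; it's an illustration, not the sample program itself:

    from google.cloud import logging

    client = logging.Client()
    logger = client.logger('my-log')

    # Unstructured data with no specified severity.
    logger.log_text('Hello, world!')

    # Unstructured data with a severity level of ERROR.
    logger.log_text('Goodbye, world!', severity='ERROR')

    # JSON structured data with no specified severity.
    logger.log_struct({
        'name': 'King Arthur',
        'quest': 'Find the Holy Grail',
        'favorite_color': 'Blue',
    })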

View log entries

To view the log entries in the Cloud Shell, run the snippets.py program with the list option:

python snippets.py my-log list

After a few moments, the command completes and the result is similar to the following:

    Listing entries for logger my-log:
    * 2018-11-15T16:05:35.548471+00:00: Hello, world!
    * 2018-11-15T16:05:35.647190+00:00: Goodbye, world!
    * 2018-11-15T16:05:35.726315+00:00: {u'favorite_color': u'Blue', u'quest': u'Find the Holy Grail', u'name': u'King Arthur'}

If the result shows no entries, retry the command. It takes a few moments for Logging to receive and process log entries.

You can also view your log entries by using the Logs Viewer. See View logs in the Logs Viewer for more details.
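The list option reads entries back through the same client library. A minimal sketch of the equivalent call, for illustration:

    from google.cloud import logging

    client = logging.Client()
    logger = client.logger('my-log')

    # Iterate over the entries in my-log. The payload is a string for
    # text entries and a dict for structured (JSON) entries.
    for entry in logger.list_entries():
        print('* {}: {}'.format(entry.timestamp.isoformat(), entry.payload))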

Delete log entries

To delete all of the log entries in the log my-log, run the snippets.py program with the delete option:

python snippets.py my-log delete

After a few moments, the command completes with the result:

Deleted all logging entries for my-log.
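With the client library, deleting a log and all of its entries is a single call. A minimal sketch of the equivalent operation:

    from google.cloud import logging

    client = logging.Client()

    # Deletes the log my-log, including all of its entries.
    client.logger('my-log').delete()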

Export logs

Logging can export log entries to Cloud Storage buckets, BigQuery datasets, and Pub/Sub topics. For detailed information on exporting, see Overview of Logs Exports.

In this section, you do the following:

  • Create a Cloud Storage bucket as the destination for your data.
  • Create a sink that copies new log entries to the destination.
  • Update the permissions of your Cloud Storage bucket.
  • Write log entries to Logging.
  • Optionally, verify the content of your Cloud Storage bucket.

Create destination

The export destination for this quickstart is a Cloud Storage bucket. To create a Cloud Storage bucket:

  1. In the Cloud Console, go to Storage > Browser:

    Go to Browser

  2. Click Create bucket.

  3. Select a name for your bucket.

  4. Select Regional and choose the closest geographic option for the Location.

  5. For the Access control model, select Set object-level and bucket-level permissions.

  6. Leave all other settings at their default value. Click Create.

This quickstart uses a Cloud Storage bucket name of myloggingproject-1.
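If you prefer the command line, you can create a similar bucket with the gsutil tool; the location us-east1 is only an example, so substitute the region closest to you:

    gsutil mb -c regional -l us-east1 gs://myloggingproject-1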

Create sink

A sink is a rule that determines whether Logging exports a newly arrived log entry to a destination. A sink has three attributes:

  • Name
  • Destination
  • Query

If a newly arrived log entry meets the query conditions, then that log entry is exported to the destination.

The export.py program uses the Python client libraries to create, list, modify, and delete sinks. To create the sink mysink that exports all log entries with a severity of at least INFO to the Cloud Storage bucket myloggingproject-1, run the following command:

python export.py create mysink myloggingproject-1 "severity>=INFO"

To view your sinks, run the export.py program with the list option:

python export.py list

The result looks like the following:

    mysink: severity>=INFO -> storage.googleapis.com/myloggingproject-1
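The sink operations in export.py are thin wrappers over the client library. A minimal sketch of creating and listing the same sink with the client library directly, for illustration:

    from google.cloud import logging

    client = logging.Client()

    # A sink has a name, a query (filter), and a destination.
    sink = client.sink(
        'mysink',
        filter_='severity>=INFO',
        destination='storage.googleapis.com/myloggingproject-1')
    sink.create()

    # List all sinks in the project.
    for s in client.list_sinks():
        print('{}: {} -> {}'.format(s.name, s.filter_, s.destination))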

Update destination permissions

The permissions of the destination (in this case, your Cloud Storage bucket) aren't modified when you create a sink by using the export.py program. You must change the permission settings of your Cloud Storage bucket to grant write permission to your sink.

To update the permissions on your Cloud Storage bucket:

  1. Identify your sink's writer identity:

    1. Go to the Logs Viewer page:

      Go to the Logs Viewer page

    2. Select Exports to see a summary of your sinks, including each sink's writer identity.

  2. From the Cloud Console, click Storage > Browser:

    Go to Browser

  3. To open the detailed view, click the name of your bucket.

  4. Select Permissions and click Add members.

  5. Set the Role to Storage Object Creator and enter your sink's writer identity.

See Destination permissions for more information.
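Alternatively, you can grant the role from the command line with gsutil. Replace [WRITER_IDENTITY] with the service account email from your sink's writer identity:

    gsutil iam ch serviceAccount:[WRITER_IDENTITY]:objectCreator gs://myloggingproject-1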

Validate sink

To validate that your sink and destination are properly configured:

  1. Write new log entries to the log my-log:

    python snippets.py my-log write
    
  2. View your Cloud Storage bucket contents:

    1. From the Cloud Console, click Storage > Browser:

      Go to Browser

    2. To open the detailed view, click the name of your bucket. The detailed view lists the folders that contain data. If there is no data in your bucket, the following message is displayed:

      There are no live objects in this bucket.

      As described in Exported logs availability, it might take 2 or 3 hours before the first entries appear at the destination, or before you are notified of a configuration error.

    3. After your bucket has received data, the detail view shows a result similar to:

      Bucket contents

    4. The exported data is organized in a hierarchy of folders: the top-level folder is named after the log, and its subfolders are labeled successively with the year, month, and day. To view the data that was exported by your sink, click the folder name my-log, and then continue clicking through the year, month, and day subfolders until you reach a file that ends with .json:

      Bucket contents

    5. The JSON file contains the log entries that were exported to your Cloud Storage bucket. Click the name of the JSON file to see its contents. The contents are similar to:

       {"insertId":"yf1cshfoivz48",
       "logName":"projects/loggingproject-222616/logs/my-log",
       "receiveTimestamp":"2018-11-15T23:06:14.738729911Z",
       "resource":{"labels":{"project_id":"loggingproject-222616"},"type":"global"},
       "severity":"ERROR",
       "textPayload":"Goodbye, world!",
       "timestamp":"2018-11-15T23:06:14.738729911Z"}
      

      Because the severity level ERROR is greater than the severity level INFO, the log entry containing the string "Goodbye, world!" is exported to the sink destination. The other log entries weren't exported to the destination because their severity level was set to the default value, and the default severity level is less than INFO.

Troubleshooting

There are several reasons why a Cloud Storage bucket might be empty:

  • You haven't waited long enough for the data to appear in the bucket. It might take 2 or 3 hours before the first entries appear at the destination, or before you are notified of a configuration error. See Exported logs availability for details.

  • You have a configuration error. In this case, you receive an email message with a subject line similar to the following:

     [ACTION REQUIRED] Stackdriver Logging export config error in myloggingproject.

    The content of the email body describes the configuration issue. For example, if you didn't update your destination permissions, the email lists the following error code:

     bucket_permission_denied

    To correct this particular condition, see Update permissions.

  • You didn't write log entries after you created the sink. The sink is applied only to newly arriving log entries. To correct this situation, write new log entries:

     python snippets.py my-log write
    

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this quickstart, follow these steps.

  1. (Optional) Delete the log entries you created. If you don't delete your log entries, they will expire and be removed. See Quota Policy. To delete all log entries in the log my-log, run the following command:

     python snippets.py my-log delete
    
  2. Delete your project or delete your quickstart resources.

    • To delete your project, from the Cloud Console Project Info pane, click Go to project settings, and then click Shut down.

    • To delete your quickstart resources:

      1. Delete your sink by running the following command:

        python export.py delete mysink
        
      2. Delete your Cloud Storage bucket. Go to the Cloud Console and click Storage > Browser. Place a check in the box next to your bucket name and then click Delete.

What's next

  • See Service Accounts for a detailed discussion of service accounts, access scopes, and Cloud Identity and Access Management roles.
  • See Logs Viewer for a more detailed discussion of the Logs Viewer.
  • See Exporting logs to learn how to export your log entries to Cloud Storage, BigQuery, and Pub/Sub.
  • See Logging Agent to learn how to collect log entries from your virtual machine instances in Logging.
  • See Audit Logs for your auditing and compliance needs.
  • See Stackdriver Logging API to learn how to read, write, and configure logs from your applications.