Using Cloud Logging as a logging server for a VFX rendering pipeline

This tutorial outlines how to use Cloud Logging instead of a logging server for application-specific logging. By default, Cloud Logging aggregates logs from the system and from many common applications. The tutorial also shows how to get logs into Cloud Logging from custom workflows or applications that are not on the common applications list.

Short-lived workers are common in many compute workloads, such as in visual effects (VFX) render pipelines and build systems. This tutorial focuses on VFX workloads and uses the standalone VFX renderer V-Ray as an example. In a typical use case, VM instances are created on-demand by a queueing system, assigned a job (such as one or more frames to render), and terminated after the job is completed. You must capture the logs not only from the render process, but also from any pre- or post-render jobs the instance performs, such as file conversion, or copying frames rendered locally to common storage.


Objectives

  • Create an instance and install the Cloud Logging agent on the instance.
  • Configure your custom application or workflow to send logs to the Cloud Logging agent.
  • Use the Python client library to send logs directly to Logging.
  • View, filter, and search logs in Logging.
  • Export logs from Logging to long-term accessible storage.


Costs

This tutorial uses the following billable components of Google Cloud:

  • Compute Engine

To generate a cost estimate based on your projected usage, use the pricing calculator. New Google Cloud users might be eligible for a free trial.

See Logging Pricing to understand your costs related to the use of Cloud Logging in this tutorial.

Before you begin

  1. In the Google Cloud Console, on the project selector page, select or create a Google Cloud project.

    Go to the project selector page

  2. Make sure that billing is enabled for your Cloud project. Learn how to confirm that billing is enabled for your project.

  3. Enable the Compute Engine API.

    Enable the API

  4. Install the gcloud beta commands component:

    gcloud components install beta
  5. Set your default project so you don't have to supply the --project flag with each command:

    gcloud config set project PROJECT_ID

Creating a Compute Engine instance

The Cloud Logging agent works on Compute Engine virtual machine (VM) instances and Amazon Elastic Compute Cloud (Amazon EC2) VM instances. For more information about the agent and the VM instances it supports, see Logging Agent in the product documentation.

For the purposes of this tutorial, you can create an instance with a default VM type. In production, however, you must decide how much computing power your application requires and choose a VM accordingly.

  1. In the Cloud Console, go to the VM instances page.

    Go to VM instances

  2. Click Create instance.
  3. On the Create a new instance page, fill in the properties for your instance. For advanced configuration options, expand the Management, security, disks, networking, sole tenancy section.
  4. Click Create to create the instance.

It takes a few moments to create your new instance. In this tutorial, the VM instance is named sd-tutorial.

Configuring the Compute Engine instance

After you've created the VM instance, complete the following steps using an account that has superuser (sudo) permissions:

  1. Use SSH to connect to the sd-tutorial instance.

    gcloud compute ssh sd-tutorial
  2. Install the Cloud Logging agent. For detailed instructions, see Installing the Cloud Logging Agent.

  3. Download and install pip:

    sudo yum install python-pip
  4. Download and install the Cloud Logging Python library:

    pip install --user --upgrade google-cloud-logging
  5. Create the Cloud Logging agent config file for V-Ray at /etc/google-fluentd/config.d/vray.conf with the following contents:

    <source>
      @type tail
      read_from_head true
      format /^(\[(?<time>.*?)\])?((?<severity> ?(debug|info|warning|error|critical)): )?(?<message>.*)$/
      time_format %Y/%b/%d|%H:%M:%S
      # Log file names must be of the format SHOW.SEQ.SHOT.ROLE.log.
      # For example: myfilm.fba.0050.render.log
      path /home/*/vray/logs/*.log
      pos_file /var/lib/google-fluentd/pos/vray.pos
      tag vray.*
    </source>
    <filter vray.**>
      @type record_transformer
      <record>
        # Parse the log file name and add additional key:value records
        # to aid in filtering and searching logs in Logging.
        # Assumes you are following the convention in this tutorial.
        show ${tag_parts[-5]}
        seq ${tag_parts[-4]}
        shot ${tag_parts[-3]}
        role ${tag_parts[-2]}
      </record>
      tag ${tag_suffix[1]} # Strip off the "vray." prefix.
    </filter>

    For more information on fluentd configuration, see Config File Syntax.

  6. Reload the Cloud Logging agent configuration:

    sudo service google-fluentd reload

    In a production setting, you could next make this configured VM into a custom VM image that your pipeline could start on demand.
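Lines that don't match the `format` regex are not parsed into fields, so it can help to sanity-check the pattern locally before baking it into an image. The following sketch translates the fluentd (Ruby) regex into Python's `(?P<name>...)` named-group syntax and parses a few sample lines; the helper function and sample lines are illustrations, not part of the agent.

```python
import re

# Python translation of the fluentd `format` regex from vray.conf.
# Ruby named groups (?<name>...) become (?P<name>...) in Python.
LOG_PATTERN = re.compile(
    r'^(\[(?P<time>.*?)\])?'
    r'((?P<severity> ?(debug|info|warning|error|critical)): )?'
    r'(?P<message>.*)$'
)

def parse_line(line):
    """Return (severity, message) for a single V-Ray log line.

    Lines without a severity prefix default to 'info'.
    """
    match = LOG_PATTERN.match(line)
    severity = (match.group('severity') or 'info').strip()
    return severity, match.group('message')

print(parse_line('debug: Test log line from render-host'))
print(parse_line('[2017/Apr/12|10:30:00] error: failed to open scene file'))
```

If a sample line comes back with the whole text in `message` and no severity, the agent would ingest it as an unparsed line, which is usually the hint that the renderer's output format has drifted from the regex.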

Special considerations for VFX

Each rendering software package generates its own log output. Although this tutorial uses the V-Ray standalone renderer, you can adapt the tutorial to other renderers or applications that output to stdout/stderr. This tutorial uses a generalized fluentd configuration and expects your queueing system to redirect the output of the renderer to a filename with a specific, search-friendly format. If you run multiple jobs on a single VM, you will need to ensure unique filenames.

Name your log files in Logging

When you define the naming convention for your Logging logs, follow the best practices for naming conventions in use at your studio. Logging can search across resources, so using a consistent naming convention ensures that you can search for logs generated from different resources that are tagged with the same or similar data. In this tutorial, the queue manager populates the following values into environment variables before starting the render worker process, which are then used to tag the logs and generate a unique filename:

Field name           Environment variable   Value
Show (Project) Name  SHOW                   myfilm
Sequence Name        SEQ                    fba
Shot Number          SHOT                   0050
Role                 ROLE                   render

This example assembles these values into a naming convention typical for visual effects workflows:

${SHOW}.${SEQ}.${SHOT}.${ROLE}.log

For example, the render logs for shot fba0050 would be tagged as follows:

myfilm.fba.0050.render.log

This tutorial expects the queue manager to set the log filename according to this convention, but you can easily modify the convention to fit your studio's needs.
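As a small sketch of what the queue manager does, the following helper (a hypothetical function, not part of any library) assembles the log name from the four environment variables:

```python
import os

def build_log_name(env=os.environ):
    """Assemble SHOW.SEQ.SHOT.ROLE.log from environment variables.

    Raises KeyError if any of the four tokens is missing, because an
    incomplete name would break log filtering later on.
    """
    tokens = [env[key] for key in ('SHOW', 'SEQ', 'SHOT', 'ROLE')]
    return '.'.join(tokens) + '.log'

example_env = {'SHOW': 'myfilm', 'SEQ': 'fba', 'SHOT': '0050', 'ROLE': 'render'}
print(build_log_name(example_env))  # myfilm.fba.0050.render.log
```

Failing fast on a missing token is deliberate: a log that silently lands under a malformed name is easy to lose in a large pipeline.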

Test the Logging configuration manually

To check your configuration without using a renderer or a queueing manager, copy a sample log entry into a test log. From your home directory, enter the following commands:

mkdir -p vray/logs/
export SHOW='testshow' SEQ='testseq' SHOT='testshot' ROLE='testrole'
echo "debug: Test log line at `date` from ${HOSTNAME}" >> vray/logs/${SHOW}.${SEQ}.${SHOT}.${ROLE}.log

This line should appear shortly in the log viewer.

Verify log delivery

  1. In the Cloud Console, go to the Logs Viewer page.

    Go to the Logs Viewer page

    Under the Logs menu, you should see an entry tagged with the log name you created, in this case testshow.testseq.testshot.testrole.

  2. View this log to see your output:

    Cloud Logging log viewer

    You can also read logs using the gcloud tool beta command, replacing [project-id] and [log-name] as applicable:

    # List all logs.
    gcloud beta logging logs list
    # Read the contents of a specific log.
    gcloud beta logging read projects/[project-id]/logs/[log-name]

For more information about logging by using the gcloud command-line tool, see the documentation about reading log entries.

Log to Logging from a render process

With the configuration set up correctly and verified, you can use a V-Ray standalone command line like the following to send logs to Logging. In this example, the command-line flags start the V-Ray process with output optimized for redirection to a file. The queueing manager is expected to replace SCENE_FILE with the appropriate local file path. It should also populate the four environment variables (SHOW, SEQ, SHOT, and ROLE) that are used to generate the name of the log, as described in the Name your log files section.

vray \
   -display=0 \
   -showProgress=0 \
   -progressUseCR=0 \
   -progressUseColor=0 \
   -sceneFile SCENE_FILE > vray/logs/${SHOW}.${SEQ}.${SHOT}.${ROLE}.log 2>&1
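A queue manager might wrap this invocation roughly as in the following Python sketch. The command is passed in as a list so that the real vray binary (or any stand-in) can be substituted; the function and its signature are illustrations for this tutorial, not an API of any queueing system.

```python
import os
import subprocess

def launch_render(command, env_tokens, log_root='vray/logs'):
    """Run a render command with stdout and stderr redirected to the
    conventional SHOW.SEQ.SHOT.ROLE.log file, mirroring the shell
    redirection shown above. Returns the process exit code."""
    os.makedirs(log_root, exist_ok=True)
    log_name = '.'.join(env_tokens[k] for k in ('SHOW', 'SEQ', 'SHOT', 'ROLE')) + '.log'
    log_path = os.path.join(log_root, log_name)
    with open(log_path, 'a') as log_file:
        # Merge the naming tokens into the child environment, then
        # send both streams to the log file (the 2>&1 equivalent).
        return subprocess.call(command,
                               env=dict(os.environ, **env_tokens),
                               stdout=log_file,
                               stderr=subprocess.STDOUT)

# The real V-Ray command line from above, with SCENE_FILE supplied
# by the queue manager:
vray_command = ['vray', '-display=0', '-showProgress=0',
                '-progressUseCR=0', '-progressUseColor=0',
                '-sceneFile', 'SCENE_FILE']
```

Because the Cloud Logging agent tails `/home/*/vray/logs/*.log`, anything the child process writes through this redirection is picked up without further plumbing.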

Logging directly to the Cloud Logging API

Most VFX pipelines use some sort of scripting language to perform programmatic tasks such as asset preparation, publishing, data transfer, rendering, or transcoding. You can log the output from these tasks to Logging using a client library. This tutorial uses Python because it is so widely used in the VFX industry.

You can send logs to Logging from both on-premises and cloud-based workstations. You don't install the Logging agent to write logs this way because you communicate with Logging through its Python API.

Write a log to Cloud Logging using the Python library

To log to Cloud Logging by using a Python script, you must first do the following:

  • Build the log metadata.
  • Provide the severity level.
  • Decide what type of resource to log to.

The script performs the following:

  • Checks to ensure the use of proper naming conventions.
  • Assembles the log data.
  • Writes the log at the Google project resource level.

If you are logging from an on-premises workstation or server, you must authenticate prior to logging to Logging. If you are logging from a cloud instance, authentication is already completed.

# Copyright 2017 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from google.cloud import logging
from google.cloud.logging.resource import Resource
import getpass
import os
import socket

def write(text, severity='INFO', show=None, seq=None, shot=None, role=None, **kwargs):
    '''Wrapper method for assembling the payload to send to logger.log_text.'''

    # Extract and build LOG_ID from environment.
    # For example: 'myfilm.slb.0050.render'
    if not show:
        show = os.getenv('SHOW')
    if not seq:
        seq = os.getenv('SEQ')
    if not shot:
        shot = os.getenv('SHOT')
    if not role:
        role = os.getenv('ROLE')

    if not show or not seq or not shot or not role:
        raise Exception('One or more log name tokens are empty. Unable to log.')
    # end if

    # Assemble logger name.
    logger_name = '.'.join([show, seq, shot, role])

    print('# Logging to %s...' % logger_name)

    # Build logger object.
    logging_client = logging.Client()
    logger = logging_client.logger(logger_name)

    # Assemble the required log metadata.
    label_payload = {
        'artist': getpass.getuser(),
        'hostname': socket.gethostname(),
        'show': show,
        'seq': seq,
        'shot': shot,
        'role': role
    }

    # Add optional kwargs to payload.
    label_payload.update(kwargs)

    # Write log.
    logger.log_text(
        text,
        resource=Resource(type='project', labels={'project_id': show}),
        severity=severity,
        labels=label_payload)
# end write

You can import the module into any Python script in your pipeline and run it on either an on-premises artist workstation or a cloud instance:

import logToStackdriver as lts
lts.write('This is the text to log.', show='myfilm', seq='slb', shot='0050', role='render')

By default, all logs write to the 'project' resource, which you can find in the Google Cloud Console, under Logging > Logs > Google Project:

Cloud Logging > Logs > Google Project

Exporting logs

If you want to keep your logs beyond the Logs retention period, you should export them.

For inexpensive, long-term storage, export your logs to Cloud Storage buckets. To perform big data analysis on them, export to a BigQuery dataset. In either case, you first create an object called a sink. The sink lets you define a filter that selects the log entries you want to export, and lets you choose Cloud Storage or BigQuery as the destination. As soon as you create the sink, Logging begins exporting the matching logs to the specified destination. You can create sinks in the Logs Viewer, by using the Cloud Logging API, or by using the gcloud logging command-line tool.

Export to BigQuery

You can use SQL semantics to query logs stored in BigQuery. Many third-party analytics tools natively support these logs. Follow these steps:

  1. Create a BigQuery dataset.

  2. Create the sink with a filter to export logs to that dataset. Note that [PROJECT_ID] in the following command refers to your Google Cloud project.

    gcloud beta logging sinks create render-errors-bq \
        bigquery.googleapis.com/projects/[PROJECT_ID]/datasets/[DATASET] \
        --log-filter "jsonPayload.SHOW=myfilm AND jsonPayload.SHOT=fba AND severity>=WARNING"
  3. You will receive a message with a service account name that you must grant access to your BigQuery dataset. You can do this in the BigQuery web UI by clicking the dropdown next to the dataset name and then clicking Share dataset.

The next log sent to Logging that matches this filter will be sent to the dataset, with a slight delay. You can find more details in the How sinks work documentation.
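Filter expressions like the one above get repetitive if you create sinks per show, sequence, or shot. The following sketch is a hypothetical helper (not part of any Google library) that assembles an advanced-logs filter string from the tutorial's naming tokens:

```python
def build_log_filter(show, seq=None, shot=None, min_severity='WARNING'):
    """Assemble an advanced-logs filter string for a sink.

    Only the tokens that are provided are included in the filter;
    min_severity maps to Logging's severity>= comparison.
    """
    clauses = ['jsonPayload.SHOW=%s' % show]
    if seq:
        clauses.append('jsonPayload.SEQ=%s' % seq)
    if shot:
        clauses.append('jsonPayload.SHOT=%s' % shot)
    clauses.append('severity>=%s' % min_severity)
    return ' AND '.join(clauses)

print(build_log_filter('myfilm', shot='fba'))
# jsonPayload.SHOW=myfilm AND jsonPayload.SHOT=fba AND severity>=WARNING
```

The resulting string can be passed to the --log-filter flag of gcloud, or to the filter field when you create a sink through the API.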

Export to Cloud Storage

To save your logs to a file, export them to a Cloud Storage bucket. For Cloud Storage buckets, you can select a storage class with lower prices for less frequently accessed files, or take advantage of the free usage allowance. Cloud Storage provides easy access either by HTTP, or by direct integration with many other Google Cloud products.

The following steps show how to export to Cloud Storage:

  1. Create a Cloud Storage bucket.
  2. In Logging, create a sink with a filter to export those logs to Cloud Storage.

    gcloud beta logging sinks create render-errors-gcs \
        storage.googleapis.com/[BUCKET_NAME] \
        --log-filter "jsonPayload.SHOW=myfilm AND jsonPayload.SHOT=fba AND severity>=WARNING"

The next log sent to Logging that matches this filter will be sent to a file in the bucket, with a slight delay. You can find more details in the How sinks work documentation.

Cleaning up

To avoid incurring charges to your Google Cloud account for the resources used in this tutorial:

Delete the project

The easiest way to eliminate billing is to delete the project that you created for the tutorial.

To delete the project:

  1. In the Cloud Console, go to the Manage resources page.

    Go to the Manage resources page

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Delete your Compute Engine instances

To delete a Compute Engine instance:

  1. In the Cloud Console, go to the VM Instances page.

    Go to the VM Instances page

  2. Click the checkbox for the instance you want to delete.
  3. Click Delete to delete the instance.

Delete your Cloud Storage bucket

To delete a Cloud Storage bucket:

  1. In the Cloud Console, go to the Cloud Storage Browser page.

    Go to the Cloud Storage Browser page

  2. Click the checkbox for the bucket you want to delete.
  3. To delete the bucket, click Delete.

What's next