Using Cloud Logging as a logging server for dedicated game servers

This tutorial outlines how to use Cloud Logging instead of an on-premises logging server for application-specific logging. By default, Cloud Logging aggregates logs from the system and from many common applications. This tutorial shows how to deliver logs to Cloud Logging from custom workflows, or from applications that don't appear on that list.

This tutorial focuses on dedicated game server (DGS) instances and uses the Unreal Tournament (UT) server binary from Epic Games as an example. In a typical use case, a virtual machine (VM) is created by a provisioning system connected to the game's platform services to run a DGS ahead of player demand. The DGS generates logs, and having them centrally stored and indexed provides significant advantages when identifying and debugging issues.

Objectives

  • Create a DGS instance and install the Cloud Logging agent on the instance.
  • Configure your DGS or workflow to send logs to the Cloud Logging agent.
  • Send logs directly to Cloud Logging using the Python client library.
  • View, filter, and search logs in Cloud Logging.
  • Export logs from Cloud Logging into long-term accessible storage.

Costs

This tutorial uses the following billable components of Google Cloud:

  • Compute Engine
  • Cloud Logging
  • BigQuery
  • Cloud Storage

To generate a cost estimate based on your projected usage, use the pricing calculator. New Google Cloud users might be eligible for a free trial.

Before you begin

  1. In the Google Cloud Console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  2. Make sure that billing is enabled for your Cloud project. Learn how to confirm that billing is enabled for your project.

  3. Enable the Compute Engine API.

    Enable the API

  4. Install and initialize the Cloud SDK.
  5. Install the gcloud beta commands component:

    gcloud components install beta
  6. Set your default project so you don't have to supply the --project flag with each command:

    gcloud config set project PROJECT_ID

Creating a Compute Engine instance

The Logging agent works on Compute Engine virtual machine (VM) instances and Amazon EC2 VM instances. For more information about the agent and the VM instances it supports, see the Logging agent documentation. For the purposes of this tutorial, you can create an instance with the n1-standard-8 machine type. In a production system, decide how many DGS instances you plan to run on each VM, and choose a machine type accordingly.

  1. In the Cloud Console, go to the Create an instance page.

    Go to Create an instance

  2. On the Create a new instance page, fill in the properties for your instance. Name the instance sd-tutorial so that it matches the commands later in this tutorial, and choose a CentOS boot image so that the yum commands work as written. For advanced configuration options, expand Management, security, disks, networking, sole tenancy.
  3. To create the VM, click Create.

It takes a few moments to create your new instance.
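Alternatively, you can create an equivalent instance from the command line. The following command is a sketch: the zone is an example, and the CentOS 7 image family is an assumption chosen to match the yum commands used later in this tutorial:

    gcloud compute instances create sd-tutorial \
        --machine-type=n1-standard-8 \
        --image-family=centos-7 \
        --image-project=centos-cloud \
        --zone=us-central1-a \
        --scopes=logging-write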

Configuring the Compute Engine instance

After you create the VM instance, complete the following steps from the home directory, using an account that has superuser (sudo) permissions:

  1. In your terminal, use SSH to connect to the instance:

    gcloud compute ssh sd-tutorial
    
  2. Install the Logging agent.
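
     At the time of writing, you can install the agent by running Google's installation script; see the Logging agent installation documentation for the current procedure:

    curl -sSO https://dl.google.com/cloudagents/install-logging-agent.sh
    sudo bash install-logging-agent.sh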

  3. Download and install pip:

    sudo yum install python-pip
    
  4. Download and install the Cloud Logging Python library:

    sudo pip install --upgrade google-cloud-logging
    
  5. In the /etc/google-fluentd/config.d directory, create the ut.conf Logging agent configuration file for Unreal Tournament (UT) with the following contents:

    <source>
      @type tail
      read_from_head true
      format /^(\[(?<time>.*?):\d{3}\])?(\[[\s\d]+\])?(?<message>((?<component>\S*?):)((?<severity>\S*?):)?.*)/
      time_format %Y.%m.%d-%H.%M.%S

      # Log file names must be of the format ${ENV}.${VER}.${PORT}.log.
      # For example: prod.0112.8001.log
      path /home/*/LinuxServer/UnrealTournament/Saved/Logs/*.log
      exclude_path /home/*/LinuxServer/UnrealTournament/Saved/Logs/*backup*.log
      pos_file /var/lib/google-fluentd/pos/ut.pos
      tag ut.*
    </source>

    <filter ut.**>
      @type record_transformer
      <record>
        # Parse the log file name and add additional key:value records
        # to aid in filtering and searching logs.
        # Assumes you are following the convention in this tutorial.
        environment ${tag_parts[-4]}
        version ${tag_parts[-3]}
        port ${tag_parts[-2]}
        tag ${tag_suffix[1]} # Strip off the "ut." prefix.
      </record>
    </filter>
    

    For more information on fluentd configuration, see Configuration File Syntax.

  6. Reload the Logging agent configuration:

    sudo service google-fluentd reload
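
     If the agent doesn't pick up the new configuration, its own log usually explains why. You can check the service status and tail the agent log (these are the agent's default paths):

    sudo service google-fluentd status
    sudo tail -n 50 /var/log/google-fluentd/google-fluentd.log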
    

This tutorial assumes that the UT build is in the user's home directory, in its standard directory structure. For more information, see the Linux server setup portion of the UT Wiki.

In a production setting, we recommend that you save this configured VM's boot disk as a custom VM image that your pipeline could start on demand.
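
For example, after stopping the instance, you can capture its boot disk as an image with gcloud. The image name and zone here are placeholders:

    gcloud compute instances stop sd-tutorial --zone=us-central1-a
    gcloud compute images create dgs-logging-image \
        --source-disk=sd-tutorial \
        --source-disk-zone=us-central1-a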

Considerations when logging to Cloud Logging from a dedicated game server

Each dedicated game server generates its own log output in its own format. Although this tutorial uses the UT DGS, you can adapt the tutorial to other engines or applications that write their output to the stdout and stderr standard streams.

This tutorial uses a single fluentd configuration for all DGS instances and expects your provisioning system to redirect the output of each DGS instance to a filename with a specific, search-friendly format. If you run multiple DGS instances on a single VM, you need to use unique filenames. The scripts in this tutorial, discussed in the following section, impose log file naming restrictions, but none of these restrictions apply to Cloud Logging in general.

By default, UT automatically creates backups of log files in the logs directory, which the Logging agent reads as new files. We recommend that you disable this in production to avoid duplicate logs in Cloud Logging.

Naming your dedicated game server log files

When you define the naming convention for your DGS logs, follow the best practices or naming conventions in use at your studio. Cloud Logging can search across resources, so using a consistent naming convention helps ensure that you can search for logs generated from different DGS instances that are tagged with the same or similar data.

In this tutorial, the provisioning service populates the values in the following table into environment variables before starting the DGS process. The values are then used to tag the logs and to generate a unique filename.

Field name     Environment variable     Example values
Environment    ENV                      prod, qa, dev
Version        VER                      0112
Port           PORT                     8001

This example assembles these values into a DGS naming convention:

${ENV}.${VER}.${PORT}.log

The logs for a production DGS instance running game version 0112 on port 8001 would be named as follows:

prod.0112.8001.log

This tutorial expects your instance-management process or operations team to set the log filename according to this convention, but modifying the code provided to fit your studio's needs is straightforward.

Testing the configuration manually

  • To check your configuration without running a DGS, from your home directory, copy a sample log entry into a test log:

    mkdir -p LinuxServer/UnrealTournament/Saved/Logs
    export ENV='testenv' VER='testver' PORT='testport'
    echo "testlogger:debug: Test log line at `date` from ${HOSTNAME}" >>
    LinuxServer/UnrealTournament/Saved/Logs/${ENV}.${VER}.${PORT}.log 

The testlogger line should appear in the Logs Viewer after a few moments.

Verify log delivery

To confirm that the log was delivered to Cloud Logging, perform the following steps:

  1. In the Cloud Console, go to the Logs Viewer page.

    Go to the Logs Viewer page

    In the logs list, there is an entry tagged with the log name that you created (in this case, the dotted log file path ending in testenv.testver.testport.log).

  2. To see your output, click the log:

    Log entry in the Logs Viewer page.

For more information about gcloud logging, see the log entry documentation.
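
You can also read the entry back from the command line with gcloud logging read. The filter below assumes that the agent parsed the test entry using the configuration above, so the text lands in the jsonPayload.message field:

    gcloud logging read 'jsonPayload.message:"Test log line"' --limit=5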

Send logs from a dedicated game server

With the configuration set up correctly and verified, you can use a UT DGS command to send logs to Cloud Logging.

  • In your terminal, start the DGS instance and send log output to a properly named file:

    ~/LinuxServer/Engine/Binaries/Linux/UE4Server-Linux-Shipping UnrealTournament \
    UT-Entry?Game=Lobby \
    -log=${ENV}.${VER}.${PORT}.log 

In this example, the provisioning layer populates the environment variables (ENV, VER, and PORT) that are used to generate the name of the log file, as described in the Naming your dedicated game server log files section.
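
A minimal provisioning wrapper might look like the following sketch. It reuses the launch command above; the environment values are examples only:

    #!/bin/bash
    # Hypothetical wrapper run by the provisioning system before player demand.
    export ENV='prod' VER='0112' PORT='8001'

    ~/LinuxServer/Engine/Binaries/Linux/UE4Server-Linux-Shipping UnrealTournament \
        UT-Entry?Game=Lobby \
        -log=${ENV}.${VER}.${PORT}.log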

Logging directly to the API from applications and workflows

Most DGS deployment pipelines use some sort of scripting language to perform programmatic tasks such as asset preparation, builds, data transfer, or instance management. You can log the output from these tasks to Cloud Logging using a client library. This tutorial uses Python because it's widely used in the games industry.

You can send logs to Cloud Logging from both on-premises and cloud-based workstations by using the Cloud Logging API. When transmitting logs in this way, you don't have to install the Logging agent because you are communicating with Cloud Logging through its Python API.
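
When you log from outside Google Cloud, one common approach is to provide Application Default Credentials before running your scripts, either with a user account or with a downloaded service account key (the key path below is a placeholder):

    gcloud auth application-default login
    # or
    export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json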

Write a log to Cloud Logging using the Python library

To log from a Python script to Cloud Logging, you have to do the following:

  • Build the log metadata.
  • Provide the severity level.
  • Decide what type of resource to log to.

To write a log to Cloud Logging, do the following:

  1. Download the Python script from GitHub. The script performs the following tasks:

    • Check to ensure the use of proper naming conventions.
    • Assemble the log data.
    • Write the log at the Cloud project resource level.
  2. If you are logging from an on-premises workstation or server, you must authenticate before you can log to Cloud Logging. If you are logging from a cloud instance, authentication has already been performed and you can skip this step. The script looks like the following:

    #!/usr/bin/env python
    
    from google.cloud import logging
    from google.cloud.logging.resource import Resource
    import getpass
    import os
    import socket
    
    def write(text, severity='INFO', env=None, port=None, ver=None, **kwargs):
        '''Wrapper method for assembling the payload to send to logger.log_text.'''
    
        # Extract and build log id from environment.
        # For example: 'prod.0112.8001'
        if not env:
            env = os.getenv('ENV')
        if not port:
            port = os.getenv('PORT')
        if not ver:
            ver = os.getenv('VER')
    
        if not env or not port or not ver:
            raise Exception('One or more log name tokens are empty. Unable to log.')
        # end if
    
        # Assemble logger name.
        # Assemble logger name, following the ${ENV}.${VER}.${PORT} convention.
        logger_name = '.'.join([
            env,
            ver,
            port
        ])
    
        print('# Logging to %s...' % logger_name)
    
        # Build logger object.
        logging_client = logging.Client()
        logger = logging_client.logger(logger_name)
    
        # Assemble the required log metadata.
        label_payload = {
            "username" : getpass.getuser(),
            "hostname" : socket.gethostname(),
            "env" : env,
            "ver" : ver,
            "port" : port
        }
    
        # Add optional kwargs to payload.
        label_payload.update(kwargs)
    
        # Write log.
        logger.log_text(
            text,
            resource=Resource(type='project', labels={'project_id': logging_client.project}),
            severity=severity,
            labels=label_payload
        )
    
    # end write
    
  3. You can import the module into any Python script in your pipeline and run it on either an on-premises workstation or a cloud instance:

    import logToStackdriver as lts
    lts.write( 'This is the text to log.', env='qa', ver='0117', port='8000')
    
  4. To view the logs, in the Cloud Console, go to the Logs Viewer page.

    Go to the Logs Viewer page

    By default, all logs write to the project resource.
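
To confirm the entry from the command line, you can read it back with a filter on the labels that the script attaches. The label values here match the usage example above; replace PROJECT_ID with your project:

    gcloud logging read 'logName="projects/PROJECT_ID/logs/qa.0117.8000" AND labels.env="qa"' --limit=5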

Routing logs

If you want to keep your logs beyond the Cloud Logging retention periods, route them by using one of the following methods:

  • To perform big data analysis on them, export to a BigQuery dataset.
  • For inexpensive, long-term storage, export your logs to Cloud Storage buckets.

In either case, you first create an object called a sink. A sink contains a filter that selects the log entries you want to route and a destination, which can be a Cloud Storage bucket or a BigQuery dataset. After you create the sink, new log entries that match the filter are exported to that destination.

You can create and manage sinks in the Logs Viewer, by using the Cloud Logging API, or with the gcloud logging command-line tool.
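
For example, you can list the sinks that already exist in your project from the command line:

    gcloud logging sinks list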

Export to BigQuery

You can query logs stored in BigQuery with SQL semantics. Many third-party analytics tools are integrated with BigQuery.
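
As a sketch of what analysis looks like after export, the following query counts exported DGS entries by severity. The dataset and table names are hypothetical; Cloud Logging derives table names from the log name (and, for date-sharded tables, the date):

    bq query --use_legacy_sql=false \
        'SELECT severity, COUNT(*) AS entries
         FROM `PROJECT_ID.DATASET.prod_0112_8001_log_20200101`
         GROUP BY severity
         ORDER BY entries DESC'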

To export logs to BigQuery, do the following:

  1. Create a BigQuery dataset, and then create a sink with a filter to route logs to that dataset:

       gcloud beta logging sinks create dgs-errors-bq  \
        bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET \
        --log-filter "jsonPayload.environment=dev AND \
        jsonPayload.version=0112 AND severity>=WARNING"
    

    Replace the following:

    • PROJECT_ID: your Cloud project
    • DATASET: your new BigQuery dataset
  2. Grant the sink's service account write access to your BigQuery dataset.

    The next log sent to Cloud Logging that matches this filter is exported to the dataset after a slight delay. For more information, see the how sinks work documentation.

Export to Cloud Storage

To save your logs to a file, export them to a Cloud Storage bucket. For Cloud Storage buckets, you can select a storage class with lower prices for less frequently accessed files, or take advantage of the free usage allowance.

Cloud Storage provides easy access either by HTTP or by direct integration with many other Google Cloud products.
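
If you don't already have a bucket, you can create one from the command line. BUCKET_NAME is a placeholder, and Nearline is just one example of a storage class suited to less frequently accessed files:

    gsutil mb -c nearline gs://BUCKET_NAME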

To export logs to Cloud Storage, do the following:

  1. Create a Cloud Storage bucket and create a sink with a filter in Cloud Logging to export those logs to Cloud Storage:

    gcloud beta \
       logging sinks create dgs-errors-gcs \
       storage.googleapis.com/BUCKET_NAME \
       --log-filter "jsonPayload.environment=dev AND \
       jsonPayload.version=0112 AND severity>=WARNING"
    

    Replace BUCKET_NAME with a unique name for the Cloud Storage bucket.

  2. Grant the sink's service account write access to your Cloud Storage bucket.

    The next log sent to Cloud Logging that matches this filter is exported to a file in the bucket after a slight delay. For more information, see the how sinks work documentation.

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.

Delete the project

The easiest way to eliminate billing is to delete the project that you created for the tutorial.

To delete the project:

  1. In the Cloud Console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Delete your Compute Engine instances

To delete a Compute Engine instance:

  1. In the Cloud Console, go to the VM instances page.

    Go to VM instances

  2. Select the checkbox for the instance that you want to delete.
  3. To delete the instance, click More actions, click Delete, and then follow the instructions.

Delete your Cloud Storage bucket

To delete a Cloud Storage bucket:

  1. In the Cloud Console, go to the Cloud Storage Browser page.

    Go to Browser

  2. Click the checkbox for the bucket that you want to delete.
  3. To delete the bucket, click Delete, and then follow the instructions.

What's next

  • Explore reference architectures, diagrams, tutorials, and best practices about Google Cloud. Take a look at our Cloud Architecture Center.