Analyze a job using logs


This page describes how to enable and view logs from Cloud Logging for a job.

You can use logs to get information that is useful for analyzing your jobs. For example, logs can help you debug failed jobs. When Cloud Logging is enabled for a job, Cloud Logging generates the following types of logs for you to view:

  • task logs (batch_task_logs): logs for any data written to the standard output (stdout) and standard error (stderr) streams. To generate task logs for your job, configure your tasks to write data for analysis and debugging to these streams.
  • agent logs (batch_agent_logs): logs for activities from the Batch service agent. Batch automatically generates these logs for your job.

Note that Cloud Logging only generates logs after a job starts running. To verify whether a job has started running, describe the job and confirm that its state is RUNNING or a later state. If you need to analyze a job that did not generate logs, for example because the job failed before reaching the RUNNING state, describe the job using the gcloud CLI or Batch API and check the statusEvents field.
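For example, you can check a job's state and status events with the gcloud CLI; the job name and location below are placeholders:

```shell
# Print only the current state of the job (for example, RUNNING or FAILED).
gcloud batch jobs describe example-job \
    --location=us-central1 \
    --format="value(status.state)"

# Inspect the status events for a job that failed before reaching RUNNING.
gcloud batch jobs describe example-job \
    --location=us-central1 \
    --format="json(status.statusEvents)"
```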

Before you begin

  • If you haven't used Batch before, review Get started with Batch and enable Batch by completing the prerequisites for projects and users.
  • To get the permissions that you need to analyze a job using logs, ask your administrator to grant you the following IAM roles:

    • To create a job with logs enabled: Batch Job Editor (roles/batch.jobsEditor) on the project
    • To view logs: Logs Viewer (roles/logging.viewer) on the project

    For more information about granting roles, see Manage access.
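For example, an administrator could grant these roles using the gcloud CLI; the project ID and user email below are placeholders:

```shell
# Allow a user to create jobs (with logs enabled) in the project.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:USER_EMAIL" \
    --role="roles/batch.jobsEditor"

# Allow the same user to view the project's logs.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:USER_EMAIL" \
    --role="roles/logging.viewer"
```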

Enable logs for a job

To generate logs for a job, enable Cloud Logging when you create the job:

  • If you create a job using the Google Cloud console, Cloud Logging is always enabled.
  • If you create a job using the gcloud CLI or the Batch API, Cloud Logging is disabled by default. To enable Cloud Logging, include the following configuration for the logsPolicy field while creating the job:

        "logsPolicy": {
            "destination": "CLOUD_LOGGING"

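For example, a job created through the gcloud CLI might enable Cloud Logging with a JSON config file along these lines; the job name, location, and file name are placeholders:

```shell
# job.json contains the job definition, including the logging configuration:
#   "logsPolicy": { "destination": "CLOUD_LOGGING" }
gcloud batch jobs submit example-job \
    --location=us-central1 \
    --config=job.json
```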
View logs for a job

You can view a job's logs in Cloud Logging using the Google Cloud console, gcloud CLI, Logging API, Go, Java, or Python.


To view a job's logs using the Google Cloud console, do the following:

  1. In the Google Cloud console, go to the Job list page.

    Go to Job list

  2. In the Job name column, click the name of a job. The Job details page opens.

  3. Click the Events tab.

  4. In the Logs section, click Cloud Logging. The Logs Explorer page opens.

    By default, the Logs Explorer displays all the task logs for this job.

    Recommended: To filter which logs are displayed, build queries; for example, enter a query for Batch logs in the query-editor field.


To view logs using the gcloud CLI, use the gcloud logging read command:

gcloud logging read "QUERY"

where QUERY is a query for Batch logs.
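For example, to read the task logs of one specific job, the command might look like the following; the project ID and job UID are placeholders:

```shell
# Read up to 10 task-log entries for the job with the given UID.
gcloud logging read \
    "logName=projects/example-project/logs/batch_task_logs AND labels.job_uid=example-job-uid" \
    --limit=10
```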


To view logs using the Logging API, use the entries.list method:

    "resourceNames": [
    "filter": "QUERY"
    "orderBy": "timestamp desc"

Replace the following:

  • PROJECT_ID: the project ID of the project that you want to view logs for.
  • QUERY: a query for Batch logs.

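As a sketch, calling the entries.list method with curl might look like this; the project ID and job UID are placeholders:

```shell
# List the log entries for one Batch job, newest first.
curl -X POST "https://logging.googleapis.com/v2/entries:list" \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{
          "resourceNames": ["projects/example-project"],
          "filter": "labels.job_uid=example-job-uid",
          "orderBy": "timestamp desc"
        }'
```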
For more information, see the Batch Go API reference documentation.

import (
	"context"
	"fmt"
	"io"

	batch "cloud.google.com/go/batch/apiv1"
	"cloud.google.com/go/batch/apiv1/batchpb"
	"cloud.google.com/go/logging"
	"cloud.google.com/go/logging/logadmin"
	"google.golang.org/api/iterator"
)

// printJobLogs retrieves the logs written by the given job to Cloud Logging.
func printJobLogs(w io.Writer, projectID string, job *batchpb.Job) error {
	// projectID := "your_project_id"

	ctx := context.Background()
	batchClient, err := batch.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("NewClient: %v", err)
	}
	defer batchClient.Close()

	adminClient, err := logadmin.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("failed to create logadmin client: %v", err)
	}
	defer adminClient.Close()

	const name = "batch_task_logs"

	iter := adminClient.Entries(ctx,
		// Only get entries from the "batch_task_logs" log for the job with the given UID.
		logadmin.Filter(fmt.Sprintf(`logName = "projects/%s/logs/%s" AND labels.job_uid=%s`, projectID, name, job.Uid)),
	)

	var entries []*logging.Entry

	for {
		logEntry, err := iter.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			return fmt.Errorf("unable to fetch log entry: %v", err)
		}
		entries = append(entries, logEntry)
		fmt.Fprintf(w, "%s\n", logEntry.Payload)
	}

	fmt.Fprintf(w, "Successfully fetched %d log entries\n", len(entries))

	return nil
}



For more information, see the Batch Java API reference documentation.


import com.google.cloud.batch.v1.Job;
import com.google.cloud.logging.v2.LoggingClient;
import com.google.logging.v2.ListLogEntriesRequest;
import com.google.logging.v2.LogEntry;
import java.io.IOException;

public class ReadJobLogs {

  public static void main(String[] args) throws IOException {
    // TODO(developer): Replace these variables before running the sample.
    // Project ID or project number of the Cloud project hosting the job.
    String projectId = "YOUR_PROJECT_ID";

    // The job which logs you want to print.
    Job job = Job.newBuilder().build();

    readJobLogs(projectId, job);
  }

  // Prints the log messages created by given job.
  public static void readJobLogs(String projectId, Job job) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the `loggingClient.close()` method on the client to safely
    // clean up any remaining background resources.
    try (LoggingClient loggingClient = LoggingClient.create()) {

      ListLogEntriesRequest request = ListLogEntriesRequest.newBuilder()
          .addResourceNames(String.format("projects/%s", projectId))
          .setFilter(String.format("labels.job_uid=%s", job.getUid()))
          .build();

      for (LogEntry logEntry : loggingClient.listLogEntries(request).iterateAll()) {
        System.out.println(logEntry.getTextPayload());
      }
    }
  }
}



For more information, see the Batch Python API reference documentation.

from google.cloud import batch_v1
from google.cloud import logging


def print_job_logs(project_id: str, job: batch_v1.Job) -> None:
    """
    Prints the log messages created by given job.

    Args:
        project_id: name of the project hosting the job.
        job: the job which logs you want to print.
    """
    # Initialize client that will be used to send requests across threads. This
    # client only needs to be created once, and can be reused for multiple requests.
    log_client = logging.Client(project=project_id)
    logger = log_client.logger("batch_task_logs")

    for log_entry in logger.list_entries(filter_=f"labels.job_uid={job.uid}"):
        print(log_entry.payload)

Write queries to filter for Batch logs

You can filter for Batch logs by writing a query that includes one or more of the following filter parameters and zero or more boolean operators (AND, OR and NOT).

  • To filter for logs from a specific job, specify the job's unique ID (UID):

        labels.job_uid=JOB_UID

    where JOB_UID is the UID of the job. To get a job's UID, describe the job.

  • To filter for a specific type of Batch logs, specify the log type:

        logName=projects/PROJECT_ID/logs/BATCH_LOG_TYPE
    Replace the following:

    • PROJECT_ID: the project ID of the project that you want to view logs for.
    • BATCH_LOG_TYPE: the type of Batch logs you want to view, either batch_task_logs for task logs or batch_agent_logs for agent logs.
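As a minimal sketch of how these filter parameters compose, the following helper (hypothetical, not part of any Google library) joins a log-type filter with an optional job UID filter using the AND operator:

```python
from typing import Optional


def batch_log_query(project_id: str, job_uid: Optional[str] = None,
                    log_type: str = "batch_task_logs") -> str:
    """Build a Cloud Logging query string for Batch logs.

    Combines the logName filter for the given Batch log type with an
    optional labels.job_uid filter, joined with the AND operator.
    """
    clauses = [f"logName=projects/{project_id}/logs/{log_type}"]
    if job_uid:
        clauses.append(f"labels.job_uid={job_uid}")
    return " AND ".join(clauses)


# Example: query for the task logs of one specific job.
print(batch_log_query("example-project", job_uid="example-job-uid"))
# logName=projects/example-project/logs/batch_task_logs AND labels.job_uid=example-job-uid
```

The resulting string can be passed directly to gcloud logging read, the Logs Explorer query editor, or the filter field of the entries.list method.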

What's next