List the jobs of a project across all regions (async)

Demonstrates how to list the jobs of a project across all regions, asynchronously.

Code sample

Node.js

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
/**
 *  The kind of filter to use.
 */
// const filter = {}
/**
 *  The project which owns the jobs.
 */
// const projectId = 'abc123'
/**
 *  If there are many jobs, limit the response to at most this many.
 *  The actual number of jobs returned will be the lesser of page_size
 *  and an unspecified server-defined limit.
 */
// const pageSize = 1234
/**
 *  Set this to the 'next_page_token' field of a previous response
 *  to request additional results in a long list.
 */
// const pageToken = 'abc123'
/**
 *  The regional endpoint
 *  (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) that
 *  contains the jobs.
 */
// const location = 'abc123'

// Imports the Dataflow library
const {JobsV1Beta3Client} = require('@google-cloud/dataflow').v1beta3;

// Instantiates a client
const dataflowClient = new JobsV1Beta3Client();

async function callAggregatedListJobs() {
  // Construct the request. Uncomment the variables above and add the ones
  // you need here, for example projectId, filter, pageSize, and location.
  const request = {};

  // Run request
  const iterable = await dataflowClient.aggregatedListJobsAsync(request);
  for await (const response of iterable) {
    console.log(response);
  }
}

callAggregatedListJobs();
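
If you need to scope or page the results, populate the request with the variables declared at the top of the sample. The sketch below reuses the dataflowClient instantiated above; the projectId, pageSize, and location values are placeholders, and the field names mirror the commented variables.

// A sketch of the same call with a populated request (placeholder values).
async function callAggregatedListJobsScoped() {
  const request = {
    projectId: 'my-project-id',
    // filter: 'ACTIVE',  // optionally restrict results, e.g. to active jobs
    pageSize: 100,
    location: 'us-central1',
  };

  const iterable = await dataflowClient.aggregatedListJobsAsync(request);
  for await (const job of iterable) {
    // Each element is a Job message; id and name are among its fields.
    console.log(job.id, job.name);
  }
}

callAggregatedListJobsScoped();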

Python

from google.cloud import dataflow_v1beta3


async def sample_aggregated_list_jobs():
    # Create a client
    client = dataflow_v1beta3.JobsV1Beta3AsyncClient()

    # Initialize request argument(s). Pass the fields you need, for example
    # project_id, filter, page_size, page_token, or location.
    request = dataflow_v1beta3.ListJobsRequest()

    # Make the request (the async client method must be awaited)
    page_result = await client.aggregated_list_jobs(request=request)

    # Handle the response
    async for response in page_result:
        print(response)
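
The sample above only defines the coroutine; it never executes it. A minimal way to run it, assuming the function is defined in the same module and credentials are already configured, is to drive it with asyncio:

import asyncio

asyncio.run(sample_aggregated_list_jobs())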

What's next

To search and filter code samples for other Google Cloud products, see the Google Cloud sample browser.