Execute a workflow

Executing a workflow runs the current workflow definition associated with the workflow; that is, the most recently deployed revision.

You can pass runtime arguments in a workflow execution request and access those arguments using a workflow variable. For more information, see Pass runtime arguments in an execution request.
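
For example, the myFirstWorkflow sample used later on this page reads a searchTerm runtime argument. Assuming that workflow is already deployed, you could pass the argument with the gcloud CLI:

gcloud workflows run myFirstWorkflow --data='{"searchTerm":"Friday"}'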

After a workflow execution completes, its history and results are retained for a limited time. For more information, see Quotas and limits.

Before you begin

Security constraints defined by your organization might prevent you from completing the following steps. For troubleshooting information, see Develop applications in a constrained Google Cloud environment.

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Google Cloud project.

  4. If a workflow accesses other Google Cloud resources, make sure it is associated with a service account that has the correct permissions to do so. To learn what service account is associated with an existing workflow, see Verify a workflow's associated service account.

    Note that to create a resource and attach a service account, you need permissions to create that resource and to impersonate the service account that you will attach to the resource. For more information, see Service account permissions.

  5. Deploy a workflow using the Google Cloud console or the Google Cloud CLI.
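
    For example, with the gcloud CLI, a minimal deployment looks like the following (a sketch that assumes your workflow definition is saved in a file named myFirstWorkflow.yaml):

    gcloud workflows deploy myFirstWorkflow --source=myFirstWorkflow.yaml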

Execute a workflow

You can execute a workflow in the Google Cloud console, with the gcloud CLI, through the client libraries, or by sending a request to the Workflows REST API.

Console

  1. To execute a workflow, in the Google Cloud console, go to the Workflows page:

    Go to Workflows

  2. On the Workflows page, select a workflow to go to its details page.

  3. On the Workflow details page, click Execute.

  4. On the Execute workflow page, in the Input pane, you can enter optional runtime arguments to pass to your workflow before execution. Arguments must be in JSON format; for example, {"animal":"cat"}. If your workflow doesn't use runtime arguments, leave this blank.

  5. Optionally, specify the level of call logging that you want to apply to the execution of the workflow. In the Call log level list, select one of:

    • Not specified: no logging level is specified. This is the default. When no execution log level is specified, the workflow log level applies; otherwise, the execution log level takes precedence over the workflow log level.
    • Errors only: log all caught exceptions, and log when a call is stopped due to an exception.
    • All calls: log all calls to subworkflows or library functions and their results.
    • No logs: no call logging.

  6. Click Execute.

  7. On the Execution details page, you can view the results of the execution including any output, the execution ID and state, and the current or final step of the workflow execution. For more information, see Access workflow execution results.

gcloud

  1. Open a terminal.

  2. Find the name of the workflow you want to execute. If you don't know the workflow's name, you can enter the following command to list all your workflows:

    gcloud workflows list
    
  3. You can execute the workflow using either the gcloud workflows run command or the gcloud workflows execute command:

    • Execute the workflow and wait for the execution to complete:

      gcloud workflows run WORKFLOW_NAME \
          --call-log-level=CALL_LOGGING_LEVEL \
          --data=DATA
      
    • Execute the workflow without waiting for the execution attempt to finish:

      gcloud workflows execute WORKFLOW_NAME \
          --call-log-level=CALL_LOGGING_LEVEL \
          --data=DATA
      

      Replace the following:

      • WORKFLOW_NAME: the name of the workflow.
      • CALL_LOGGING_LEVEL (optional): level of call logging to apply during execution. Can be one of:

        • none: no logging level is specified. This is the default. When no execution log level is specified, the workflow log level applies; otherwise, the execution log level takes precedence over the workflow log level.
        • log-errors-only: log all caught exceptions, and log when a call is stopped due to an exception.
        • log-all-calls: log all calls to subworkflows or library functions and their results.
        • log-none: no call logging.
      • DATA (optional): runtime arguments for your workflow in JSON format.
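
      For example, the following command runs myFirstWorkflow to completion, logs only errors, and passes a runtime argument; the searchTerm value is illustrative:

      gcloud workflows run myFirstWorkflow \
          --call-log-level=log-errors-only \
          --data='{"searchTerm":"Friday"}'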

  4. If you ran gcloud workflows execute, the unique ID of the workflow execution attempt is returned and the output is similar to the following:

     To view the workflow status, you can use following command:
     gcloud workflows executions describe b113b589-8eff-4968-b830-8d35696f0b33 --workflow workflow-2 --location us-central1

    To view the status of the execution, enter the command returned by the previous step.

If the execution attempt is successful, the output is similar to the following, with a state indicating the workflow's success, and a status that specifies the final workflow step of the execution.

argument: '{"searchTerm":"Friday"}'
endTime: '2022-06-22T12:17:53.086073678Z'
name: projects/1051295516635/locations/us-central1/workflows/myFirstWorkflow/executions/c4dffd1f-13db-46a0-8a4a-ee39c144cb96
result: '["Friday","Friday the 13th (franchise)","Friday Night Lights (TV series)","Friday
    the 13th (1980 film)","Friday the 13th","Friday the 13th (2009 film)","Friday the
    13th Part III","Friday the 13th Part 2","Friday (Rebecca Black song)","Friday Night
    Lights (film)"]'
startTime: '2022-06-22T12:17:52.799387653Z'
state: SUCCEEDED
status:
  currentSteps:
  - routine: main
    step: returnOutput
workflowRevisionId: 000001-ac2

Client libraries

The following samples assume you have already deployed a workflow, myFirstWorkflow.

  1. Install the client library and set up your development environment. For details, see the Workflows client libraries overview.

  2. Clone the sample app repository to your local machine:

    Java

    git clone https://github.com/GoogleCloudPlatform/java-docs-samples.git

    Alternatively, you can download the sample as a zip file and extract it.

    Node.js

    git clone https://github.com/GoogleCloudPlatform/nodejs-docs-samples.git

    Alternatively, you can download the sample as a zip file and extract it.

    Python

    git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git

    Alternatively, you can download the sample as a zip file and extract it.

  3. Change to the directory that contains the Workflows sample code:

    Java

    cd java-docs-samples/workflows/cloud-client/

    Node.js

    cd nodejs-docs-samples/workflows/quickstart/

    Python

    cd python-docs-samples/workflows/cloud-client/

  4. Take a look at the sample code:

    Java

    // Imports the Google Cloud client library
    
    import com.google.cloud.workflows.executions.v1.CreateExecutionRequest;
    import com.google.cloud.workflows.executions.v1.Execution;
    import com.google.cloud.workflows.executions.v1.ExecutionsClient;
    import com.google.cloud.workflows.executions.v1.WorkflowName;
    import java.io.IOException;
    import java.util.concurrent.ExecutionException;
    
    public class WorkflowsQuickstart {
    
      private static final String PROJECT = System.getenv("GOOGLE_CLOUD_PROJECT");
      private static final String LOCATION = System.getenv().getOrDefault("LOCATION", "us-central1");
      private static final String WORKFLOW =
          System.getenv().getOrDefault("WORKFLOW", "myFirstWorkflow");
    
      public static void main(String... args)
          throws IOException, InterruptedException, ExecutionException {
        if (PROJECT == null) {
          throw new IllegalArgumentException(
              "Environment variable 'GOOGLE_CLOUD_PROJECT' is required to run this quickstart.");
        }
        workflowsQuickstart(PROJECT, LOCATION, WORKFLOW);
      }
    
      private static volatile boolean finished;
    
      public static void workflowsQuickstart(String projectId, String location, String workflow)
          throws IOException, InterruptedException, ExecutionException {
        // Initialize client that will be used to send requests. This client only needs
        // to be created once, and can be reused for multiple requests. After completing all of your
        // requests, call the "close" method on the client to safely clean up any remaining background
        // resources.
        try (ExecutionsClient executionsClient = ExecutionsClient.create()) {
          // Construct the fully qualified location path.
          WorkflowName parent = WorkflowName.of(projectId, location, workflow);
    
          // Creates the execution object.
          CreateExecutionRequest request =
              CreateExecutionRequest.newBuilder()
                  .setParent(parent.toString())
                  .setExecution(Execution.newBuilder().build())
                  .build();
          Execution response = executionsClient.createExecution(request);
    
          String executionName = response.getName();
          System.out.printf("Created execution: %s%n", executionName);
    
          long backoffTime = 0;
          long backoffDelay = 1_000; // Start wait with delay of 1,000 ms
          final long backoffTimeout = 10 * 60 * 1_000; // Time out at 10 minutes
          System.out.println("Poll for results...");
    
          // Wait for execution to finish, then print results.
          while (!finished && backoffTime < backoffTimeout) {
            Execution execution = executionsClient.getExecution(executionName);
            finished = execution.getState() != Execution.State.ACTIVE;
    
            // If we haven't seen the results yet, wait.
            if (!finished) {
              System.out.println("- Waiting for results");
              Thread.sleep(backoffDelay);
              backoffTime += backoffDelay;
              backoffDelay *= 2; // Double the delay to provide exponential backoff.
            } else {
              System.out.println("Execution finished with state: " + execution.getState().name());
              System.out.println("Execution results: " + execution.getResult());
            }
          }
        }
      }
    }

    Node.js (JavaScript)

    const {ExecutionsClient} = require('@google-cloud/workflows');
    const client = new ExecutionsClient();
    /**
     * TODO(developer): Uncomment these variables before running the sample.
     */
    // const projectId = 'my-project';
    // const location = 'us-central1';
    // const workflow = 'myFirstWorkflow';
    // const searchTerm = '';
    
    /**
     * Executes a Workflow and waits for the results with exponential backoff.
     * @param {string} projectId The Google Cloud Project containing the workflow
     * @param {string} location The workflow location
     * @param {string} workflow The workflow name
     * @param {string} searchTerm Optional search term to pass to the Workflow as a runtime argument
     */
    async function executeWorkflow(projectId, location, workflow, searchTerm) {
      /**
       * Sleeps the process N number of milliseconds.
       * @param {Number} ms The number of milliseconds to sleep.
       */
      function sleep(ms) {
        return new Promise(resolve => {
          setTimeout(resolve, ms);
        });
      }
      const runtimeArgs = searchTerm ? {searchTerm: searchTerm} : {};
      // Execute workflow
      try {
        const createExecutionRes = await client.createExecution({
          parent: client.workflowPath(projectId, location, workflow),
          execution: {
            // Runtime arguments can be passed as a JSON string
            argument: JSON.stringify(runtimeArgs),
          },
        });
        const executionName = createExecutionRes[0].name;
        console.log(`Created execution: ${executionName}`);
    
        // Wait for execution to finish, then print results.
        let executionFinished = false;
        let backoffDelay = 1000; // Start wait with delay of 1,000 ms
        console.log('Poll every second for result...');
        while (!executionFinished) {
          const [execution] = await client.getExecution({
            name: executionName,
          });
          executionFinished = execution.state !== 'ACTIVE';
    
          // If we haven't seen the result yet, wait a second.
          if (!executionFinished) {
            console.log('- Waiting for results...');
            await sleep(backoffDelay);
            backoffDelay *= 2; // Double the delay to provide exponential backoff.
          } else {
            console.log(`Execution finished with state: ${execution.state}`);
            console.log(execution.result);
            return execution.result;
          }
        }
      } catch (e) {
        console.error(`Error executing workflow: ${e}`);
      }
    }
    
    executeWorkflow(projectId, location, workflow, searchTerm).catch(err => {
      console.error(err.message);
      process.exitCode = 1;
    });
    

    Node.js (TypeScript)

    import {ExecutionsClient} from '@google-cloud/workflows';
    const client: ExecutionsClient = new ExecutionsClient();
    /**
     * TODO(developer): Uncomment these variables before running the sample.
     */
    // const projectId = 'my-project';
    // const location = 'us-central1';
    // const workflow = 'myFirstWorkflow';
    // const searchTerm = '';
    
    /**
     * Executes a Workflow and waits for the results with exponential backoff.
     * @param {string} projectId The Google Cloud Project containing the workflow
     * @param {string} location The workflow location
     * @param {string} workflow The workflow name
     * @param {string} searchTerm Optional search term to pass to the Workflow as a runtime argument
     */
    async function executeWorkflow(
      projectId: string,
      location: string,
      workflow: string,
      searchTerm: string
    ) {
      /**
       * Sleeps the process N number of milliseconds.
       * @param {Number} ms The number of milliseconds to sleep.
       */
      function sleep(ms: number): Promise<unknown> {
        return new Promise(resolve => {
          setTimeout(resolve, ms);
        });
      }
      const runtimeArgs = searchTerm ? {searchTerm: searchTerm} : {};
      // Execute workflow
      try {
        const createExecutionRes = await client.createExecution({
          parent: client.workflowPath(projectId, location, workflow),
          execution: {
            // Runtime arguments can be passed as a JSON string
            argument: JSON.stringify(runtimeArgs),
          },
        });
        const executionName = createExecutionRes[0].name;
        console.log(`Created execution: ${executionName}`);
    
        // Wait for execution to finish, then print results.
        let executionFinished = false;
        let backoffDelay = 1000; // Start wait with delay of 1,000 ms
        console.log('Poll every second for result...');
        while (!executionFinished) {
          const [execution] = await client.getExecution({
            name: executionName,
          });
          executionFinished = execution.state !== 'ACTIVE';
    
          // If we haven't seen the result yet, wait a second.
          if (!executionFinished) {
            console.log('- Waiting for results...');
            await sleep(backoffDelay);
            backoffDelay *= 2; // Double the delay to provide exponential backoff.
          } else {
            console.log(`Execution finished with state: ${execution.state}`);
            console.log(execution.result);
            return execution.result;
          }
        }
      } catch (e) {
        console.error(`Error executing workflow: ${e}`);
      }
    }
    
    executeWorkflow(projectId, location, workflow, searchTerm).catch(
      (err: Error) => {
        console.error(err.message);
        process.exitCode = 1;
      }
    );

    Python

    import time
    
    from google.cloud import workflows_v1
    from google.cloud.workflows import executions_v1
    from google.cloud.workflows.executions_v1 import Execution
    from google.cloud.workflows.executions_v1.types import executions
    
    
    def execute_workflow(
        project: str, location: str = "us-central1", workflow: str = "myFirstWorkflow"
    ) -> Execution:
        """Execute a workflow and print the execution results.
    
        A workflow consists of a series of steps described using the Workflows syntax, and can be written in either YAML or JSON.
    
        Args:
            project: The Google Cloud project id which contains the workflow to execute.
            location: The location for the workflow
            workflow: The ID of the workflow to execute.
    
        Returns:
            The execution response.
        """
        # Set up API clients.
        execution_client = executions_v1.ExecutionsClient()
        workflows_client = workflows_v1.WorkflowsClient()
        # Construct the fully qualified location path.
        parent = workflows_client.workflow_path(project, location, workflow)
    
        # Execute the workflow.
        response = execution_client.create_execution(request={"parent": parent})
        print(f"Created execution: {response.name}")
    
        # Wait for execution to finish, then print results.
        execution_finished = False
        backoff_delay = 1  # Start wait with delay of 1 second
        print("Poll for result...")
        while not execution_finished:
            execution = execution_client.get_execution(request={"name": response.name})
            execution_finished = execution.state != executions.Execution.State.ACTIVE
    
            # If we haven't seen the result yet, wait a second.
            if not execution_finished:
                print("- Waiting for results...")
                time.sleep(backoff_delay)
                # Double the delay to provide exponential backoff.
                backoff_delay *= 2
            else:
                print(f"Execution finished with state: {execution.state.name}")
                print(f"Execution results: {execution.result}")
                return execution
    
    

    The sample does the following:

    1. Sets up the Cloud Client Libraries for Workflows.
    2. Executes a workflow.
    3. Polls the workflow's execution (using exponential backoff) until the execution terminates.
    4. Prints the execution results.
  5. To run the sample, first install dependencies:

    Java

    mvn compile

    Node.js (JavaScript)

    npm install

    Node.js (TypeScript)

    npm install && npm run build

    Python

    pip3 install -r requirements.txt

  6. Run the script:

    Java

    GOOGLE_CLOUD_PROJECT=PROJECT_ID LOCATION=CLOUD_REGION WORKFLOW=WORKFLOW_NAME mvn compile exec:java -Dexec.mainClass=com.example.workflows.WorkflowsQuickstart

    Node.js (JavaScript)

    npm start PROJECT_ID CLOUD_REGION WORKFLOW_NAME

    Node.js (TypeScript)

    npm start PROJECT_ID CLOUD_REGION WORKFLOW_NAME

    Python

    GOOGLE_CLOUD_PROJECT=PROJECT_ID LOCATION=CLOUD_REGION WORKFLOW=WORKFLOW_NAME python3 main.py

    Replace the following:

    • PROJECT_ID (required): the Project ID of the Google Cloud project
    • CLOUD_REGION: the location for the workflow (default: us-central1)
    • WORKFLOW_NAME: the ID of the workflow (default: myFirstWorkflow)
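
    For example, to run the Python sample using the defaults for the location (us-central1) and the workflow name (myFirstWorkflow), with my-project standing in for your own project ID:

    GOOGLE_CLOUD_PROJECT=my-project python3 main.py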

    The output is similar to the following:

    Execution finished with state: SUCCEEDED
    ["Sunday","Sunday in the Park with George","Sunday shopping","Sunday Bloody Sunday","Sunday Times Golden Globe Race","Sunday All Stars","Sunday Night (South Korean TV series)","Sunday Silence","Sunday Without God","Sunday Independent (Ireland)"]
    

REST API

To create a new execution using the latest revision of a given workflow, use the projects.locations.workflows.executions.create method.

Note that to authenticate, you will need a service account with sufficient privileges to execute the workflow. For example, you can grant a service account the Workflows Invoker role (roles/workflows.invoker) so that the account has permission to trigger your workflow execution. For more information, see Invoke Workflows.
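
One way to grant that role with the gcloud CLI (a sketch; replace SA_EMAIL with the service account's email address):

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=serviceAccount:SA_EMAIL \
    --role=roles/workflows.invoker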

Before using any of the request data, make the following replacements:

  • PROJECT_NUMBER: your Google Cloud project number listed in the IAM & Admin Settings page.
  • LOCATION: the region in which the workflow is deployed—for example, us-central1.
  • WORKFLOW_NAME: the user-defined name for the workflow—for example, myFirstWorkflow.
  • PARAMETER: optional. If the workflow you are executing can receive runtime arguments that you pass it as part of an execution request, you can add to the request body a JSON-formatted string whose value is one or more escaped parameter-value pairs—for example, "{\"searchTerm\":\"asia\"}".
  • VALUE: optional. The value of a parameter-value pair that your workflow can receive as a runtime argument.
  • CALL_LOGGING_LEVEL: optional. The call logging level to apply during execution. The default is that no logging level is specified and the workflow log level applies instead. For more information, see Send Logs to Logging. One of the following:
    • CALL_LOG_LEVEL_UNSPECIFIED: no logging level is specified, so the workflow log level applies instead. This is the default. If any other level is specified, the execution log level applies and takes precedence over the workflow log level.
    • LOG_ERRORS_ONLY: log all caught exceptions, and log when a call is stopped due to an exception.
    • LOG_ALL_CALLS: log all calls to subworkflows or library functions and their results.
    • LOG_NONE: no call logging.
  • BACKLOG_EXECUTION: optional. If set to true, the execution is not backlogged when the concurrency quota is exhausted. For more information, see Manage execution backlogging.

Request JSON body:

{
  "argument": "{\"PARAMETER\":\"VALUE\"}",
  "callLogLevel": "CALL_LOGGING_LEVEL",
  "disableConcurrencyQuotaOverflowBuffering": "BACKLOG_EXECUTION"
}

To send your request, save the request body in a file named request.json and use an HTTP client such as curl.
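
For example, a minimal curl invocation, assuming the gcloud CLI is available to supply an access token:

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d @request.json \
    "https://workflowexecutions.googleapis.com/v1/projects/PROJECT_NUMBER/locations/LOCATION/workflows/WORKFLOW_NAME/executions"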

If successful, the response body contains a newly created instance of Execution:

{
  "name": "projects/PROJECT_NUMBER/locations/LOCATION/workflows/WORKFLOW_NAME/executions/EXECUTION_ID",
  "startTime": "2023-11-07T14:35:27.215337069Z",
  "state": "ACTIVE",
  "argument": "{\"PARAMETER\":\"VALUE\"}",
  "workflowRevisionId": "000001-2df",
  "callLogLevel": "CALL_LOGGING_LEVEL",
  "status": {}
}

Check the status of executions

There are several commands to help you check the status of a workflow execution.

  • To retrieve a list of a workflow's execution attempts and their IDs, enter the following command:

    gcloud workflows executions list WORKFLOW_NAME
    

    Replace WORKFLOW_NAME with the name of the workflow.

    The command returns a NAME value that is similar to the following:

    projects/PROJECT_NUMBER/locations/REGION/workflows/WORKFLOW_NAME/executions/EXECUTION_ID

    Copy the execution ID to use in the next command.

  • To check the status of an execution attempt and wait for the attempt to finish, enter the following command:

    gcloud workflows executions wait EXECUTION_ID
    

    Replace EXECUTION_ID with the execution attempt's ID.

    The command waits for the execution attempt to finish and then returns the results.

  • To wait until the last execution is complete and then return the result of the completed execution, enter the following command:

    gcloud workflows executions wait-last
    

    If you made a previous execution attempt in the same gcloud session, the command waits for the prior execution attempt to finish and then returns the results of the completed execution. If no previous attempt exists, gcloud returns the following error:

    ERROR: (gcloud.workflows.executions.wait-last) [NOT FOUND] There are no cached executions available.
    
  • To get the status of the last execution, enter the following command:

    gcloud workflows executions describe-last
    

    If you made a previous execution attempt in the same gcloud session, the command returns the results of the last execution even if it is running. If no previous attempt exists, gcloud returns the following error:

    ERROR: (gcloud.beta.workflows.executions.describe-last) [NOT FOUND] There are no cached executions available.
    

Filter executions

You can apply filters to the list of workflow executions returned by the workflows.executions.list method.

You can filter on the following fields:

  • createTime
  • disableOverflowBuffering
  • duration
  • endTime
  • executionId
  • label
  • startTime
  • state
  • stepName
  • workflowRevisionId

For example, to filter on a label (labels."fruit":"apple"), you can make an API request similar to the following:

GET https://workflowexecutions.googleapis.com/v1/projects/MY_PROJECT/locations/MY_LOCATION/workflows/MY_WORKFLOW/executions?view=full&filter=labels.%22fruit%22%3A%22apple%22

Where:

  • view=full specifies a view that determines which fields are filled in the returned executions; in this case, all the data
  • labels.%22fruit%22%3A%22apple%22 is the URL-encoded form of the filter labels."fruit":"apple"

For more information, see AIP-160 Filtering.
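
As another illustrative sketch, to list only the executions that finished successfully, you could filter on the state field; the filter state="SUCCEEDED" URL-encodes to state%3D%22SUCCEEDED%22:

GET https://workflowexecutions.googleapis.com/v1/projects/MY_PROJECT/locations/MY_LOCATION/workflows/MY_WORKFLOW/executions?view=full&filter=state%3D%22SUCCEEDED%22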

Manage execution backlogging

You can use execution backlogging to avoid client-side retries, remove execution delays, and maximize throughput. Backlogged executions automatically run as soon as execution concurrency quota becomes available.

There is a maximum number of active workflow executions that can run concurrently. Once this quota is exhausted, if execution backlogging is disabled, or if the quota for backlogged executions is also reached, any new executions fail with an HTTP 429 Too Many Requests status code. With execution backlogging enabled, new executions succeed and are created in a QUEUED state. As soon as execution concurrency quota becomes available, the executions automatically run and enter an ACTIVE state.

By default, execution backlogging is enabled for all requests (including those triggered by Cloud Tasks) with the following exceptions:

  • When creating an execution using an executions.run or executions.create connector in a workflow, execution backlogging is disabled by default. You can configure it by explicitly setting the execution's disableConcurrencyQuotaOverflowBuffering field to false.
  • For executions triggered by Pub/Sub, execution backlogging is disabled and can't be configured.

Note the following:

  • Queued executions are started in a first-in-first-out (FIFO) order, on a best-effort basis.
  • A createTime timestamp field indicates when an execution is created. The startTime timestamp indicates when an execution is automatically popped from the backlog queue and starts running. For executions that are not backlogged, both timestamp values are identical.
  • The limit for backlogged executions can be observed using the workflowexecutions.googleapis.com/executionbacklogentries quota metric. For more information, see View and manage quotas.

Disable execution backlogging

You can disable execution backlogging by setting a flag when using the Google Cloud CLI. For example:

gcloud workflows execute WORKFLOW_NAME \
    --disable-concurrency-quota-overflow-buffering

Or, you can disable execution backlogging by setting the disableConcurrencyQuotaOverflowBuffering field to true in the request JSON body when sending an execution request to the Workflows REST API. For example:

{
  "argument": {"arg1":"value1"},
  "callLogLevel": "LOG_NONE",
  "disableConcurrencyQuotaOverflowBuffering": true
}

For more information, see Execute a workflow.

What's next