Executing a workflow runs the current workflow definition associated with the workflow.
You can pass runtime arguments in a workflow execution request and access those arguments using a workflow variable. For more information, see Pass runtime arguments in an execution request.
After a workflow execution completes, its history and results are retained for a limited time. For more information, see Quotas and limits.
Before you begin
Security constraints defined by your organization might prevent you from completing the following steps. For troubleshooting information, see Develop applications in a constrained Google Cloud environment.
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
- Make sure that billing is enabled for your Google Cloud project.
- If a workflow accesses other Google Cloud resources, make sure it is associated with a service account that has the correct permissions to do so. To learn which service account is associated with an existing workflow, see Verify a workflow's associated service account.
  Note that to create a resource and attach a service account, you need permissions to create that resource and to impersonate the service account that you will attach to the resource. For more information, see Service account permissions.
- Deploy a workflow using the Google Cloud console or the Google Cloud CLI.
Execute a workflow
You can execute a workflow using the client libraries, in the Google Cloud console, using the gcloud CLI, or by sending a request to the Workflows REST API.
Console
To execute a workflow, in the Google Cloud console, go to the Workflows page:
On the Workflows page, select a workflow to go to its details page.
On the Workflow details page, click Execute.
On the Execute workflow page, in the Input pane, you can enter optional runtime arguments to pass to your workflow before execution. Arguments must be in JSON format; for example, {"animal":"cat"}. If your workflow doesn't use runtime arguments, leave this blank.
Optionally, specify the level of call logging that you want to apply to the execution of the workflow. In the Call log level list, select one of:
- Not specified: no logging level is specified. This is the default. An execution log level takes precedence over any workflow log level, unless the execution log level is not specified (the default); in that case, the workflow log level applies.
- Errors only: log all caught exceptions; or when a call is stopped due to an exception.
- All calls: log all calls to subworkflows or library functions and their results.
- No logs: no call logging.
Click Execute.
On the Execution details page, you can view the results of the execution including any output, the execution ID and state, and the current or final step of the workflow execution. For more information, see Access workflow execution results.
gcloud
Open a terminal.
Find the name of the workflow you want to execute. If you don't know the workflow's name, you can enter the following command to list all your workflows:
gcloud workflows list
You can execute the workflow using either the gcloud workflows run command or the gcloud workflows execute command:
Execute the workflow and wait for the execution to complete:
gcloud workflows run WORKFLOW_NAME \
    --call-log-level=CALL_LOGGING_LEVEL \
    --data=DATA
Execute the workflow without waiting for the execution attempt to finish:
gcloud workflows execute WORKFLOW_NAME \
    --call-log-level=CALL_LOGGING_LEVEL \
    --data=DATA
Replace the following:
- WORKFLOW_NAME: the name of the workflow.
- CALL_LOGGING_LEVEL (optional): the level of call logging to apply during execution. Can be one of:
  - none: no logging level is specified. This is the default. An execution log level takes precedence over any workflow log level, unless the execution log level is not specified (the default); in that case, the workflow log level applies.
  - log-errors-only: log all caught exceptions; or when a call is stopped due to an exception.
  - log-all-calls: log all calls to subworkflows or library functions and their results.
  - log-none: no call logging.
- DATA (optional): runtime arguments for your workflow in JSON format.
If you ran gcloud workflows execute, the unique ID of the workflow execution attempt is returned and the output is similar to the following:
To view the workflow status, you can use the following command:
gcloud workflows executions describe b113b589-8eff-4968-b830-8d35696f0b33 --workflow workflow-2 --location us-central1
To view the status of the execution, enter the command returned by the previous step.
If the execution attempt is successful, the output is similar to the
following, with a state
indicating the workflow's success, and a status
that specifies the final workflow step of the execution.
argument: '{"searchTerm":"Friday"}'
endTime: '2022-06-22T12:17:53.086073678Z'
name: projects/1051295516635/locations/us-central1/workflows/myFirstWorkflow/executions/c4dffd1f-13db-46a0-8a4a-ee39c144cb96
result: '["Friday","Friday the 13th (franchise)","Friday Night Lights (TV series)","Friday the 13th (1980 film)","Friday the 13th","Friday the 13th (2009 film)","Friday the 13th Part III","Friday the 13th Part 2","Friday (Rebecca Black song)","Friday Night Lights (film)"]'
startTime: '2022-06-22T12:17:52.799387653Z'
state: SUCCEEDED
status:
  currentSteps:
  - routine: main
    step: returnOutput
workflowRevisionId: 000001-ac2
Client libraries
The following samples assume you have already deployed a workflow,
myFirstWorkflow
.
Install the client library and set up your development environment. For details, see the Workflows client libraries overview.
Clone the sample app repository to your local machine:
Java
git clone https://github.com/GoogleCloudPlatform/java-docs-samples.git
Alternatively, you can download the sample as a zip file and extract it.
Node.js
git clone https://github.com/GoogleCloudPlatform/nodejs-docs-samples.git
Alternatively, you can download the sample as a zip file and extract it.
Python
git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
Alternatively, you can download the sample as a zip file and extract it.
Change to the directory that contains the Workflows sample code:
Java
cd java-docs-samples/workflows/cloud-client/
Node.js
cd nodejs-docs-samples/workflows/quickstart/
Python
cd python-docs-samples/workflows/cloud-client/
Take a look at the sample code:
Java
Node.js (JavaScript)
Node.js (TypeScript)
Python
The sample does the following:
- Sets up the Cloud Client Libraries for Workflows.
- Executes a workflow.
- Polls the workflow's execution (using exponential backoff) until the execution terminates.
- Prints the execution results.
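The polling step in these samples can be sketched as follows. This is a minimal illustration of polling with exponential backoff, not the samples' exact code; the backoff parameters and the stand-in `get_state` callable are assumptions (the real samples call the Executions client and inspect the returned execution's state).

```python
import time


def poll_until_done(get_state, initial_delay=1.0, multiplier=2.0, max_delay=16.0):
    """Poll get_state() with exponential backoff until it reports a
    terminal state, then return that state."""
    delay = initial_delay
    while True:
        state = get_state()
        if state != "ACTIVE":  # terminal: for example SUCCEEDED, FAILED, or CANCELLED
            return state
        time.sleep(delay)
        # Double the wait between polls, up to a ceiling.
        delay = min(delay * multiplier, max_delay)


# Stand-in for the Executions client: the real samples fetch the execution
# by name and read its state on each iteration.
states = iter(["ACTIVE", "ACTIVE", "SUCCEEDED"])
print(poll_until_done(lambda: next(states), initial_delay=0.01))  # SUCCEEDED
```

Exponential backoff keeps the number of API calls low for long-running executions while still returning promptly for short ones.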
To run the sample, first install dependencies:
Java
mvn compile
Node.js (JavaScript)
npm install
Node.js (TypeScript)
npm install && npm run build
Python
pip3 install -r requirements.txt
Run the script:
Java
GOOGLE_CLOUD_PROJECT=PROJECT_ID LOCATION=CLOUD_REGION WORKFLOW=WORKFLOW_NAME mvn compile exec:java -Dexec.mainClass=com.example.workflows.WorkflowsQuickstart
Node.js (JavaScript)
npm start PROJECT_ID CLOUD_REGION WORKFLOW_NAME
Node.js (TypeScript)
npm start PROJECT_ID CLOUD_REGION WORKFLOW_NAME
Python
GOOGLE_CLOUD_PROJECT=PROJECT_ID LOCATION=CLOUD_REGION WORKFLOW=WORKFLOW_NAME python3 main.py
Replace the following:
- PROJECT_ID (required): the Project ID of the Google Cloud project.
- CLOUD_REGION: the location for the workflow (default: us-central1).
- WORKFLOW_NAME: the ID of the workflow (default: myFirstWorkflow).
The output is similar to the following:
Execution finished with state: SUCCEEDED ["Sunday","Sunday in the Park with George","Sunday shopping","Sunday Bloody Sunday","Sunday Times Golden Globe Race","Sunday All Stars","Sunday Night (South Korean TV series)","Sunday Silence","Sunday Without God","Sunday Independent (Ireland)"]
REST API
To create a new execution using the latest revision of a given workflow, use
the
projects.locations.workflows.executions.create
method.
Note that to authenticate, you will need a service account with sufficient
privileges to execute the workflow. For example, you can grant a service
account the Workflows Invoker role
(roles/workflows.invoker
) so that the account has permission to
trigger your workflow execution. For more information, see
Invoke Workflows.
Before using any of the request data, make the following replacements:
- PROJECT_NUMBER: your Google Cloud project number listed on the IAM & Admin Settings page.
- LOCATION: the region in which the workflow is deployed; for example, us-central1.
- WORKFLOW_NAME: the user-defined name for the workflow; for example, myFirstWorkflow.
- PARAMETER: optional. If the workflow you are executing can receive runtime arguments that you pass it as part of an execution request, you can add to the request body a JSON-formatted string whose value is one or more escaped parameter-value pairs; for example, "{\"searchTerm\":\"asia\"}".
- VALUE: optional. The value of a parameter-value pair that your workflow can receive as a runtime argument.
- CALL_LOGGING_LEVEL: optional. The call logging level to apply during execution. The default is that no logging level is specified and the workflow log level applies instead. For more information, see Send logs to Logging. One of the following:
  - CALL_LOG_LEVEL_UNSPECIFIED: no logging level is specified and the workflow log level applies instead. This is the default. Otherwise, the execution log level applies and takes precedence over the workflow log level.
  - LOG_ERRORS_ONLY: log all caught exceptions; or when a call is stopped due to an exception.
  - LOG_ALL_CALLS: log all calls to subworkflows or library functions and their results.
  - LOG_NONE: no call logging.
- BACKLOG_EXECUTION: optional. If set to true, the execution is not backlogged when the concurrency quota is exhausted. For more information, see Manage execution backlogging.
Request JSON body:
{ "argument": "{\"PARAMETER\":\"VALUE\"}", "callLogLevel": "CALL_LOGGING_LEVEL", "disableConcurrencyQuotaOverflowBuffering": "BACKLOG_EXECUTION" }
If successful, the response body contains a newly created instance of Execution
:
{ "name": "projects/PROJECT_NUMBER/locations/LOCATION/workflows/WORKFLOW_NAME/executions/EXECUTION_ID", "startTime": "2023-11-07T14:35:27.215337069Z", "state": "ACTIVE", "argument": "{\"PARAMETER\":\"VALUE\"}", "workflowRevisionId": "000001-2df", "callLogLevel": "CALL_LOGGING_LEVEL", "status": {} }
Check the status of executions
There are several commands to help you check the status of a workflow execution.
To retrieve a list of a workflow's execution attempts and their IDs, enter the following command:
gcloud workflows executions list WORKFLOW_NAME
Replace WORKFLOW_NAME with the name of the workflow.
The command returns a NAME value that is similar to the following:
projects/PROJECT_NUMBER/locations/REGION/workflows/WORKFLOW_NAME/executions/EXECUTION_ID
Copy the execution ID to use in the next command.
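The execution ID is the final path segment of the returned NAME value. If you are scripting, you can extract it; for example, in Python (the example ID below is taken from the gcloud output shown earlier):

```python
# Resource name as returned by `gcloud workflows executions list`.
name = ("projects/PROJECT_NUMBER/locations/REGION/workflows/"
        "WORKFLOW_NAME/executions/b113b589-8eff-4968-b830-8d35696f0b33")

# The execution ID is everything after the last slash.
execution_id = name.rsplit("/", 1)[-1]
print(execution_id)  # b113b589-8eff-4968-b830-8d35696f0b33
```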
To check the status of an execution attempt and wait for the attempt to finish, enter the following command:
gcloud workflows executions wait EXECUTION_ID
Replace EXECUTION_ID with the execution attempt's ID.
The command waits for the execution attempt to finish and then returns the results.
To wait until the last execution is complete and then return the result of the completed execution, enter the following command:
gcloud workflows executions wait-last
If you made a previous execution attempt in the same gcloud session, the command waits for the prior execution attempt to finish and then returns the results of the completed execution. If no previous attempt exists, gcloud returns the following error:
ERROR: (gcloud.workflows.executions.wait-last) [NOT FOUND] There are no cached executions available.
To get the status of the last execution, enter the following command:
gcloud workflows executions describe-last
If you made a previous execution attempt in the same gcloud session, the command returns the results of the last execution even if it is running. If no previous attempt exists, gcloud returns the following error:
ERROR: (gcloud.beta.workflows.executions.describe-last) [NOT FOUND] There are no cached executions available.
Filter executions
You can apply filters to the list of workflow executions returned by the
workflows.executions.list
method.
You can filter on the following fields:
createTime
disableOverflowBuffering
duration
endTime
executionId
label
startTime
state
stepName
workflowRevisionId
For example, to filter on a label (labels."fruit":"apple"), you can make an API request similar to the following:
GET https://workflowexecutions.googleapis.com/v1/projects/MY_PROJECT/locations/MY_LOCATION/workflows/MY_WORKFLOW/executions?view=full&filter=labels.%22fruit%22%3A%22apple%22
Where:
- view=full specifies a view defining which fields should be filled in the returned executions; in this case, all the data.
- labels.%22fruit%22%3A%22apple%22 is the URL-encoded filter syntax.
For more information, see AIP-160 Filtering.
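The URL-encoded filter can be produced with standard library URL quoting; for example, in Python (the project, location, and workflow names are placeholders, as in the request above):

```python
from urllib.parse import quote

filter_expr = 'labels."fruit":"apple"'

# safe="" forces ':' and '"' to be percent-encoded as well.
encoded = quote(filter_expr, safe="")
url = (
    "https://workflowexecutions.googleapis.com/v1/projects/MY_PROJECT/"
    "locations/MY_LOCATION/workflows/MY_WORKFLOW/executions"
    f"?view=full&filter={encoded}"
)
print(encoded)  # labels.%22fruit%22%3A%22apple%22
```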
Manage execution backlogging
You can use execution backlogging to avoid client-side retries, remove execution delays, and maximize throughput. Backlogged executions automatically run as soon as execution concurrency quota becomes available.
There is a maximum number of active workflow executions that can run concurrently. Once this quota is exhausted, and if execution backlogging is disabled, or if the quota for backlogged executions is reached, any new executions fail with an HTTP 429 Too many requests status code. With execution backlogging enabled, new executions succeed and are created in a QUEUED state. As soon as execution concurrency quota becomes available, the executions automatically run and enter an ACTIVE state.
By default, execution backlogging is enabled for all requests (including those triggered by Cloud Tasks) with the following exceptions:
- When creating an execution using an executions.run or executions.create connector in a workflow, execution backlogging is disabled by default. You can configure it by explicitly setting the execution's disableConcurrencyQuotaOverflowBuffering field to false.
- For executions triggered by Pub/Sub, execution backlogging is disabled and can't be configured.
Note the following:
- Queued executions are started in a first-in-first-out (FIFO) order, on a best-effort basis.
- A createTime timestamp field indicates when an execution is created. The startTime timestamp indicates when an execution is automatically popped from the backlog queue and starts running. For executions that are not backlogged, both timestamp values are identical.
- The limit for backlogged executions can be observed using the workflowexecutions.googleapis.com/executionbacklogentries quota metric. For more information, see View and manage quotas.
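One way to see whether an execution was backlogged is to compare the two timestamps: a non-zero gap between createTime and startTime means the execution sat in the backlog queue. A sketch in Python (the timestamp values are illustrative, with fractional seconds truncated to microseconds so strptime can parse them):

```python
from datetime import datetime

# Illustrative timestamps as they might appear in an execution resource.
create_time = "2023-11-07T14:35:27.215337Z"
start_time = "2023-11-07T14:35:41.002114Z"

fmt = "%Y-%m-%dT%H:%M:%S.%fZ"
queued_for = (datetime.strptime(start_time, fmt)
              - datetime.strptime(create_time, fmt))

# Zero for executions that were never backlogged.
print(f"backlogged for {queued_for.total_seconds():.3f}s")
```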
Disable execution backlogging
You can disable execution backlogging by setting a flag when using the Google Cloud CLI. For example:
gcloud workflows execute WORKFLOW_NAME --disable-concurrency-quota-overflow-buffering
Or, you can disable execution backlogging by setting the
disableConcurrencyQuotaOverflowBuffering
field to true
in the request JSON
body when sending an execution request to the Workflows REST API.
For example:
{ "argument": {"arg1":"value1"}, "callLogLevel": "LOG_NONE", "disableConcurrencyQuotaOverflowBuffering": true }
For more information, see Execute a workflow.