Dataflow - Create Job task
The Dataflow - Create Job task lets you create a job in Cloud Dataflow to run a data pipeline built using one of the Apache Beam SDKs.
Cloud Dataflow is a fully managed Google Cloud service for running streaming and batch data processing pipelines.
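Under the hood, creating a Dataflow job corresponds to the Dataflow REST API method projects.locations.jobs.create. The following Python snippet is a minimal, illustrative sketch of that direct API call (not the task's internal implementation), assuming the google-api-python-client library and Application Default Credentials; the project ID, location, and job body are placeholders.

```python
# Minimal sketch of the equivalent direct API call that the
# Dataflow - Create Job task abstracts away: projects.locations.jobs.create.
# Requires: pip install google-api-python-client google-auth
# Authenticates with Application Default Credentials.
from googleapiclient.discovery import build

dataflow = build("dataflow", "v1b3")  # Dataflow REST API client

project_id = "my-project"   # placeholder: your Google Cloud project ID
location = "us-central1"    # placeholder: regional endpoint for the job

# Placeholder Job resource. In the task, this corresponds to the
# Request input parameter; a real job needs a complete job definition.
job_body = {
    "name": "example-job",
    "type": "JOB_TYPE_BATCH",
}

response = (
    dataflow.projects()
    .locations()
    .jobs()
    .create(projectId=project_id, location=location, body=job_body)
    .execute()
)
print(response.get("id"), response.get("currentState"))
```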
Before you begin
Ensure that you perform the following tasks in your Google Cloud project before configuring the Dataflow - Create Job task:
- Enable the Dataflow API (dataflow.googleapis.com). One way to enable it is shown after this list.
- Create an authentication profile. Application Integration uses an authentication profile to connect to an authentication endpoint for the Dataflow - Create Job task.
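If the Dataflow API is not already enabled in your project, and assuming you have the gcloud CLI installed and the required permissions, one way to enable it is to run `gcloud services enable dataflow.googleapis.com`.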
For information about granting additional roles or permissions to a service account, see Granting, changing, and revoking access.
Configure the Dataflow - Create Job task
- In the Google Cloud console, go to the Application Integration page.
- In the navigation menu, click Integrations.
The Integrations page appears, listing all the integrations available in the Google Cloud project.
- Select an existing integration or click Create integration to create a new one.
If you are creating a new integration:
- Enter a name and description in the Create Integration pane.
- Select a region for the integration.
- Select a service account for the integration. You can change or update the service account details of an integration any time from the Integration summary pane in the integration toolbar.
- Click Create.
This opens the integration in the integration editor.
- In the integration editor navigation bar, click Tasks to view the list of available tasks and connectors.
- Click and place the Dataflow - Create Job element in the integration editor.
- Click the Dataflow - Create Job element on the designer to view the Dataflow - Create Job task configuration pane.
- Go to Authentication, and select an existing authentication profile that you want to use.
Optional: If you have not created an authentication profile before configuring the task, click + New authentication profile and follow the steps in Create a new authentication profile.
- Go to Task Input, and configure the displayed input fields using the following Task input parameters table.
Changes to the input fields are saved automatically.
Task input parameters
The following table describes the input parameters of the Dataflow - Create Job task:
Property | Data type | Description
---|---|---
Region | String | Cloud Dataflow location for the job.
ProjectsId | String | Your Google Cloud project ID.
Location | String | The regional endpoint that contains this job.
Request | JSON | The job creation request body. See request JSON structure; an illustrative sketch follows this table.
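For illustration only, a minimal Request body (a Dataflow Job resource) might look like the following Python dict. All field values are placeholders, and the full set of supported fields is described in the request JSON structure reference.

```python
# Illustrative sketch of a Request body (a Dataflow Job resource).
# All values are placeholders; see the request JSON structure reference
# for the complete list of supported fields.
request_body = {
    "name": "example-job",        # placeholder job name
    "type": "JOB_TYPE_BATCH",     # or "JOB_TYPE_STREAMING"
    "environment": {
        "tempStoragePrefix": "gs://my-bucket/temp",  # placeholder bucket
        "zone": "us-central1-f",                     # placeholder zone
    },
}
```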
Task output
The Dataflow - Create Job task returns the newly created Job instance.
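The returned Job resource includes fields such as the job's id, which downstream tasks or external tooling can use to track the job. As a hedged sketch (again assuming google-api-python-client and placeholder identifiers), you could poll the job's state like this:

```python
# Sketch: checking the state of the job created by the task, using the
# id field from the task output. All identifiers are placeholders.
from googleapiclient.discovery import build

dataflow = build("dataflow", "v1b3")
job = (
    dataflow.projects()
    .locations()
    .jobs()
    .get(projectId="my-project", location="us-central1", jobId="JOB_ID")
    .execute()
)
print(job.get("currentState"))  # for example, "JOB_STATE_RUNNING"
```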
Error handling strategy
An error handling strategy for a task specifies the action to take if the task fails due to a temporary error. For information about how to use an error handling strategy and about the different types of error handling strategies, see Error handling strategies.
Quotas and limits
For information about quotas and limits, see Quotas and limits.
What's next
- Add edges and edge conditions.
- Test and publish your integration.
- Configure a trigger.
- Add a Data Mapping task.
- See all tasks for Google Cloud services.