Preview: This feature is subject to the "Pre-GA Offerings Terms" in the General Service Terms section of the Service Specific Terms. Pre-GA features are available "as is" and might have limited support. For more information, see the launch stage descriptions.

The Dataflow - Create Job task lets you create a job in Cloud Dataflow to run a data pipeline built using one of the Apache Beam SDKs.
Cloud Dataflow is a fully managed Google Cloud service for running streaming and batch data processing pipelines.
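Purely as context for what such a job runs, here is a minimal sketch of a pipeline written with the Apache Beam Python SDK. The project ID, region, and bucket paths are placeholders; setting the runner to DataflowRunner is what sends the pipeline to Cloud Dataflow instead of running it locally.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder project, region, and bucket values.
options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",
    region="us-central1",
    temp_location="gs://my-bucket/temp",
)

# A classic word count: read text, split into words, count, write results.
with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input.txt")
        | "Split" >> beam.FlatMap(lambda line: line.split())
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda word, count: f"{word}: {count}")
        | "Write" >> beam.io.WriteToText("gs://my-bucket/counts")
    )
```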
Before you begin
Ensure that you perform the following tasks in your Google Cloud project before configuring the Dataflow - Create Job task:

1. Enable the Dataflow API (dataflow.googleapis.com).
2. Create an authentication profile. Application Integration uses an authentication profile to connect to an authentication endpoint for the Dataflow - Create Job task. For information about granting additional roles or permissions to a service account, see Granting, changing, and revoking access.
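Step 1 can also be done from the command line. Assuming the gcloud CLI is installed and authenticated against the project you intend to use, the equivalent command is:

```
gcloud services enable dataflow.googleapis.com
```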
Configure the Dataflow - Create Job task

1. In the Google Cloud console, go to the Application Integration page.
2. In the navigation menu, click Integrations. The Integrations page appears, listing all the integrations available in the Google Cloud project.
3. Select an existing integration, or click Create integration to create a new one.
   If you are creating a new integration:
   1. Enter a name and description in the Create Integration pane.
   2. Select a region for the integration.
   3. Select a service account for the integration. You can change or update the service account details of the integration at any time from the Integration summary pane in the integration toolbar.
   4. Click Create. The newly created integration opens in the integration editor.
4. In the integration editor navigation bar, click Tasks to view the list of available tasks and connectors.
5. Click and place the Dataflow - Create Job element in the integration editor.
6. Click the Dataflow - Create Job element on the designer to view the Dataflow - Create Job task configuration pane.
7. Go to Authentication, and select an existing authentication profile that you want to use.
   Optional. If you have not created an authentication profile prior to configuring the task, click + New authentication profile and follow the steps in Create a new authentication profile.
8. Go to Task Input, and configure the displayed input fields using the following Task input parameters table.
   Changes to the input fields are saved automatically.
Task input parameters

The following table describes the input parameters of the Dataflow - Create Job task:
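Among these parameters are the region, project ID, location, and a Request JSON structure. As a hedged illustration of the Request input, the sketch below builds a minimal hypothetical batch-job payload; field names follow the Dataflow v1b3 Job resource, the job name and bucket are placeholders, and the exact required fields depend on how your pipeline is defined.

```python
import json

# Hypothetical minimal Request payload for the task. Field names follow
# the Dataflow v1b3 Job resource; name and bucket are placeholders.
request_body = {
    "name": "my-beam-job",        # placeholder job name
    "type": "JOB_TYPE_BATCH",     # or "JOB_TYPE_STREAMING"
    "environment": {
        "tempStoragePrefix": "gs://my-bucket/temp",  # placeholder bucket
    },
}

print(json.dumps(request_body, indent=2))
```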
Task output

The Dataflow - Create Job task returns the newly created instance of the Job.
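For illustration only (every value below is invented; field names come from the same Dataflow v1b3 Job resource), the returned instance carries metadata along these lines:

```python
# Invented example of the Job instance the task returns; all values
# are placeholders.
created_job = {
    "id": "2025-01-01_00_00_00-1234567890123456789",  # server-assigned
    "projectId": "my-project",
    "name": "my-beam-job",
    "type": "JOB_TYPE_BATCH",
    "currentState": "JOB_STATE_PENDING",  # typical state right after creation
    "createTime": "2025-01-01T00:00:00Z",
    "location": "us-central1",
}
```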
Error handling strategy

An error handling strategy for a task specifies the action to take if the task fails due to a temporary error. For information about how to use an error handling strategy, and about the different types of error handling strategies, see Error handling strategies.
Quotas and limits

For information about quotas and limits, see Quotas and limits.
[[["Mudah dipahami","easyToUnderstand","thumb-up"],["Memecahkan masalah saya","solvedMyProblem","thumb-up"],["Lainnya","otherUp","thumb-up"]],[["Sulit dipahami","hardToUnderstand","thumb-down"],["Informasi atau kode contoh salah","incorrectInformationOrSampleCode","thumb-down"],["Informasi/contoh yang saya butuhkan tidak ada","missingTheInformationSamplesINeed","thumb-down"],["Masalah terjemahan","translationIssue","thumb-down"],["Lainnya","otherDown","thumb-down"]],["Terakhir diperbarui pada 2025-09-03 UTC."],[[["\u003cp\u003eThe Dataflow - Create Job task allows users to create and run data pipelines within Cloud Dataflow, a managed Google Cloud service for stream and batch data processing.\u003c/p\u003e\n"],["\u003cp\u003eBefore configuring this task, users must enable the Dataflow API and create an authentication profile for Application Integration to connect to the appropriate authentication endpoint.\u003c/p\u003e\n"],["\u003cp\u003eConfiguring the Dataflow - Create Job task involves selecting an authentication profile and specifying input parameters such as region, project ID, location, and a JSON request structure.\u003c/p\u003e\n"],["\u003cp\u003eThe task returns a new instance of a Dataflow Job upon successful execution, which is outlined in the documentation on supported connectors.\u003c/p\u003e\n"],["\u003cp\u003eThe Dataflow - Create Job task is currently in a Pre-GA phase, subject to the "Pre-GA Offerings Terms," meaning it is available "as is" with potential limitations in support.\u003c/p\u003e\n"]]],[],null,["# Dataflow - Create Job task\n\nSee the [supported connectors](/integration-connectors/docs/connector-reference-overview) for Application Integration.\n\nDataflow - Create Job task\n==========================\n\n|\n| **Preview**\n|\n|\n| This feature is subject to the \"Pre-GA Offerings Terms\" in the General Service Terms section\n| of the [Service Specific Terms](/terms/service-terms#1).\n|\n| Pre-GA features are available \"as is\" and might have limited support.\n|\n| For more information, see the\n| [launch stage descriptions](/products#product-launch-stages).\n\nThe **Dataflow - Create Job** task lets you create a job in Cloud Dataflow to run a data pipeline built using one of the Apache Beam SDKs.\n\n[Cloud Dataflow](/dataflow/docs/about-dataflow) is a fully managed Google Cloud service for running stream and Batch data processing pipelines.\n\nBefore you begin\n----------------\n\nEnsure that you perform the following tasks in your Google Cloud project before configuring the **Dataflow - Create Job** task:\n\n1. Enable the Dataflow API (`dataflow.googleapis.com`).\n\n\n [Enable the Dataflow API](https://console.cloud.google.com/flows/enableapi?apiid=dataflow.googleapis.com)\n2. Create an [authentication profile](/application-integration/docs/configuring-auth-profile#createAuthProfile). Application Integration uses an authentication profile to connect to an authentication endpoint for the **Dataflow - Create Job** task.\n\n For information about granting additional roles or permissions to a service account, see [Granting, changing, and revoking access](/iam/docs/granting-changing-revoking-access).\n\nConfigure the Dataflow - Create Job task\n----------------------------------------\n\n1. In the Google Cloud console, go to the **Application Integration** page.\n\n [Go to Application Integration](https://console.cloud.google.com/integrations)\n2. 
In the navigation menu, click **Integrations** .\n\n\n The **Integrations** page appears listing all the integrations available in the Google Cloud project.\n3. Select an existing integration or click **Create integration** to create a new one.\n\n\n If you are creating a new integration:\n 1. Enter a name and description in the **Create Integration** pane.\n 2. Select a region for the integration. **Note:** The **Regions** dropdown only lists the regions provisioned in your Google Cloud project. To provision a new region, click **Enable Region** . See [Enable new region](/application-integration/docs/enable-new-region) for more information.\n 3. Select a service account for the integration. You can change or update the service account details of an integration any time from the info **Integration summary** pane in the integration toolbar. **Note:** The option to select a service account is displayed only if you have enabled integration governance for the selected region.\n 4. Click **Create** . The newly created integration opens in the *integration editor*.\n\n\n4. In the *integration editor* navigation bar, click **Tasks** to view the list of available tasks and connectors.\n5. Click and place the **Dataflow - Create Job** element in the integration editor.\n6. Click the **Dataflow - Create Job** element on the designer to view the **Dataflow - Create Job** task configuration pane.\n7. Go to **Authentication** , and select an existing authentication profile that you want to use.\n\n Optional. If you have not created an authentication profile prior to configuring the task, Click **+ New authentication profile** and follow the steps as mentioned in [Create a new authentication profile](/application-integration/docs/configuring-auth-profile#createAuthProfile).\n8. Go to **Task Input** , and configure the displayed inputs fields using the following [Task input parameters](#params) table.\n\n Changes to the inputs fields are saved automatically.\n\nTask input parameters\n---------------------\n\n\nThe following table describes the input parameters of the **Dataflow - Create Job** task:\n\nTask output\n-----------\n\nThe **Dataflow - Create Job** task returns the newly created instance of the [Job](/dataflow/docs/reference/rest/v1b3/projects.jobs#Job).\n\nError handling strategy\n-----------------------\n\n\nAn error handling strategy for a task specifies the action to take if the task fails due to a [temporary error](/application-integration/docs/error-handling). For information about how to use an error handling strategy, and to know about the different types of error handling strategies, see [Error handling strategies](/application-integration/docs/error-handling-strategy).\n\nQuotas and limits\n-----------------\n\nFor information about quotas and limits, see [Quotas and limits](/application-integration/docs/quotas).\n\nWhat's next\n-----------\n\n- Add [edges and edge conditions](/application-integration/docs/edge-overview).\n- [Test and publish](/application-integration/docs/test-publish-integrations) your integration.\n- Configure a [trigger](/application-integration/docs/how-to-guides#configure-triggers).\n- Add a [Data Mapping task](/application-integration/docs/data-mapping-task).\n- See [all tasks for Google Cloud services](/application-integration/docs/how-to-guides#configure-tasks-for-google-cloud-services)."]]