# Provisioners in Cloud Data Fusion

A provisioner is responsible for creating and tearing down the cloud cluster
where the pipeline is executed. Different provisioners are capable of creating
different types of clusters on various clouds.

Each provisioner exposes a set of configuration settings that control the type
of cluster that's created for a run. For example, the Dataproc and Amazon EMR
provisioners have cluster size settings. Provisioners also have settings for
the credentials required to talk to their respective clouds and provision the
required compute nodes.
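
As an illustration of how these settings are typically supplied, the following
sketch defines a compute profile that pins a provisioner and its properties
through a REST call. The base URL, access token, provisioner name
(`gcp-dataproc`), and property names (`masterNumNodes`, `workerNumNodes`,
`serviceAccount`) are assumptions for illustration only; verify them against
your instance's API reference.

```python
# Minimal sketch: define a compute profile that fixes the provisioner and its
# settings (cluster size, credentials). Endpoint path, provisioner name, and
# property names are assumptions -- confirm them for your instance.
import requests

INSTANCE_API = "https://<instance-endpoint>/api"   # hypothetical base URL
NAMESPACE = "default"
PROFILE = "small-dataproc"

profile_spec = {
    "label": "Small Dataproc clusters",
    "description": "Ephemeral three-worker clusters for light pipelines",
    "provisioner": {
        "name": "gcp-dataproc",          # provisioner used for runs on this profile
        "properties": [
            # Cluster size settings exposed by the Dataproc provisioner (illustrative names).
            {"name": "masterNumNodes", "value": "1", "isEditable": False},
            {"name": "workerNumNodes", "value": "3", "isEditable": True},
            # Credentials: reference a service account instead of embedding a key.
            {"name": "serviceAccount",
             "value": "pipeline-runner@my-project.iam.gserviceaccount.com",
             "isEditable": False},
        ],
    },
}

resp = requests.put(
    f"{INSTANCE_API}/v3/namespaces/{NAMESPACE}/profiles/{PROFILE}",
    json=profile_spec,
    headers={"Authorization": "Bearer <access-token>"},  # placeholder token
)
resp.raise_for_status()
```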
Supported provisioners in Cloud Data Fusion
-------------------------------------------

Cloud Data Fusion supports the following provisioners; the sketch after this
list shows how a run can select one.

[Dataproc](/data-fusion/docs/concepts/dataproc)
: A fast, easy-to-use, and fully-managed cloud service for running Apache Spark
  and Apache Hadoop clusters.

Amazon Elastic MapReduce (EMR)
: Provides a managed Hadoop framework that processes vast amounts of data across
  dynamically scalable Amazon EC2 instances.

Remote Hadoop
: Runs jobs on a pre-existing Hadoop cluster, either on-premises or in the
  cloud.
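
Which provisioner a given run uses is determined by the compute profile
assigned to that run. The sketch below assumes the profile can be passed as the
`system.profile.name` runtime argument when starting the pipeline's workflow;
the endpoint path, workflow name (`DataPipelineWorkflow`), and argument format
are assumptions to check against your instance's API reference.

```python
# Minimal sketch: start a pipeline run on a specific compute profile, and
# therefore on a specific provisioner. Endpoint path, workflow name, and the
# "system.profile.name" argument format are assumptions -- verify them.
import requests

INSTANCE_API = "https://<instance-endpoint>/api"  # hypothetical base URL
NAMESPACE = "default"
PIPELINE = "my-pipeline"

resp = requests.post(
    f"{INSTANCE_API}/v3/namespaces/{NAMESPACE}/apps/{PIPELINE}"
    "/workflows/DataPipelineWorkflow/start",
    # Runtime arguments for this run; the profile created earlier selects the
    # Dataproc provisioner and its cluster-size settings.
    json={"system.profile.name": "USER:small-dataproc"},
    headers={"Authorization": "Bearer <access-token>"},  # placeholder token
)
resp.raise_for_status()
```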