You can install additional components like Docker when you create a Dataproc cluster using the [Optional components](/dataproc/docs/concepts/components/overview#available_optional_components) feature. This page describes the Docker component.
The Dataproc component installs a [Docker daemon](https://docs.docker.com/get-started/overview/#the-docker-daemon) on each cluster node and creates a Linux user "docker" and a Linux group "docker" on each node to run the Docker daemon. The component also creates a "docker" [`systemd`](https://docs.docker.com/config/daemon/systemd/) service to run the [`dockerd`](https://docs.docker.com/engine/reference/commandline/dockerd/) service. Use the `systemd` service to manage the lifecycle of the Docker service.
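For a concrete picture, here is a minimal sketch of managing the Docker service lifecycle through `systemd` on a cluster node. The commands assume you have SSH access and sudo privileges on the node; they are standard `systemctl` and `journalctl` usage, not commands taken from this page.

```
# Check whether the Docker daemon is running.
sudo systemctl status docker

# Restart the daemon, for example after editing its configuration.
sudo systemctl restart docker

# Inspect recent daemon logs through the systemd journal.
sudo journalctl -u docker --since "1 hour ago"
```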
Install the component

Install the component when you create a Dataproc cluster. The Docker component can be installed on clusters created with Dataproc image [version 1.5](/dataproc/docs/concepts/versioning/dataproc-release-1.5) or later.

See [Supported Dataproc versions](/dataproc/docs/concepts/versioning/dataproc-versions#supported_cloud_dataproc_versions) for the component version included in each Dataproc image release.
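To check which image version an existing cluster runs, you can query it with `gcloud`. This is a sketch: `cluster-name` and `region` are placeholders, and the `--format` field path assumes the standard Cluster resource layout.

```
gcloud dataproc clusters describe cluster-name \
    --region=region \
    --format="value(config.softwareConfig.imageVersion)"
```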
gcloud command

To create a Dataproc cluster that includes the Docker component, use the [gcloud dataproc clusters create](/sdk/gcloud/reference/dataproc/clusters/create) cluster-name command with the `--optional-components` flag.
[[["Mudah dipahami","easyToUnderstand","thumb-up"],["Memecahkan masalah saya","solvedMyProblem","thumb-up"],["Lainnya","otherUp","thumb-up"]],[["Sulit dipahami","hardToUnderstand","thumb-down"],["Informasi atau kode contoh salah","incorrectInformationOrSampleCode","thumb-down"],["Informasi/contoh yang saya butuhkan tidak ada","missingTheInformationSamplesINeed","thumb-down"],["Masalah terjemahan","translationIssue","thumb-down"],["Lainnya","otherDown","thumb-down"]],["Terakhir diperbarui pada 2025-09-04 UTC."],[[["\u003cp\u003eThe Docker component, installable on Dataproc clusters, provides a Docker daemon, a "docker" Linux user and group, and a "docker" systemd service on each cluster node.\u003c/p\u003e\n"],["\u003cp\u003eInstallation of the Docker component is supported on Dataproc clusters with image version 1.5 or later, and can be installed when creating a Dataproc cluster.\u003c/p\u003e\n"],["\u003cp\u003eThe component can be installed using the \u003ccode\u003egcloud\u003c/code\u003e command with the \u003ccode\u003e--optional-components=DOCKER\u003c/code\u003e flag, via the Dataproc API, or through the Google Cloud console during cluster creation.\u003c/p\u003e\n"],["\u003cp\u003eBy default, the Dataproc Docker component directs logs to Cloud Logging and configures Docker to use Container Registry in addition to the standard Docker registries.\u003c/p\u003e\n"],["\u003cp\u003eThe Docker component can be installed on clusters with Kerberos security enabled, but containers interacting with Hadoop services need their own Kerberos credentials.\u003c/p\u003e\n"]]],[],null,["You can install additional components like Docker when you create a Dataproc\ncluster using the\n[Optional components](/dataproc/docs/concepts/components/overview#available_optional_components)\nfeature. This page describes the Docker component.\n\nThe Dataproc component installs a\n[Docker daemon](https://docs.docker.com/get-started/overview/#the-docker-daemon)\non each cluster node and creates a Linux user \"docker\" and a Linux group\n\"docker\" on each node to run the Docker daemon. This component also creates\na \"docker\" [`systemd`](https://docs.docker.com/config/daemon/systemd/)\nservice to run the [`dockerd`](https://docs.docker.com/engine/reference/commandline/dockerd/) service. You should use the `systemd` service to manage the\nlifecycle of the Docker service.\n\nInstall the component\n\nInstall the component when you create a Dataproc cluster.\nThe Docker component can be installed on clusters created with\nDataproc **image [version 1.5](/dataproc/docs/concepts/versioning/dataproc-release-1.5)\nor later**.\n\nSee\n[Supported Dataproc versions](/dataproc/docs/concepts/versioning/dataproc-versions#supported_cloud_dataproc_versions)\nfor the component version included in each Dataproc image release. \n\ngcloud command\n\nTo create a Dataproc cluster that includes the Docker component,\nuse the\n[gcloud dataproc clusters create](/sdk/gcloud/reference/dataproc/clusters/create) \u003cvar translate=\"no\"\u003ecluster-name\u003c/var\u003e\ncommand with the `--optional-components` flag. \n\n```\ngcloud dataproc clusters create cluster-name \\\n --optional-components=DOCKER \\\n --region=region \\\n --image-version=1.5 \\\n ... 
REST API

The Docker component can be specified through the Dataproc API using [SoftwareConfig.Component](/dataproc/docs/reference/rest/v1/ClusterConfig#Component) as part of a [clusters.create](/dataproc/docs/reference/rest/v1/projects.regions.clusters/create) request.

Console

1. Enable the component.
   - In the Google Cloud console, open the Dataproc [Create a cluster](https://console.cloud.google.com/dataproc/clustersAdd) page. The Set up cluster panel is selected.
   - In the Components section:
     - Under Optional components, select Docker and other optional components to install on your cluster.

Enable Docker on YARN

See [Customize your Spark job runtime environment with Docker on YARN](/dataproc/docs/guides/dataproc-docker-yarn) to use a customized Docker image with YARN.

Docker Logging

By default, the Dataproc Docker component writes logs to Cloud Logging by [setting the `gcplogs` driver](/community/tutorials/docker-gcplogs-driver#setting_the_default_logging_driver); see [Viewing your logs](/community/tutorials/docker-gcplogs-driver#viewing_your_logs).

Docker Registry

The Dataproc Docker component configures Docker to use Container Registry in addition to the default Docker registries. Docker will use the Docker credential helper to authenticate with Container Registry.

Use the Docker component on a Kerberos cluster

You can install the Docker optional component on a cluster that is created with [Kerberos security enabled](/dataproc/docs/concepts/configuring-clusters/security#enabling_hadoop_secure_mode_via_kerberos).

Note: Docker is not part of the Hadoop ecosystem and isn't recognized by Hadoop services. If you run a container that communicates with Hadoop services directly, your container must have the required Kerberos keytab file and credential.
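As an illustration of that requirement, a container that talks to Kerberized Hadoop services might obtain a ticket from a mounted keytab before doing any work. Everything in this sketch (image name, keytab path, principal, workload command) is a placeholder, not a value from this page:

```
# Mount a keytab into the container read-only, then kinit with it
# before the workload contacts any Kerberized Hadoop service.
docker run --rm \
  -v /etc/security/keytab/app.keytab:/tmp/app.keytab:ro \
  your-image \
  bash -c 'kinit -kt /tmp/app.keytab app/host.example.com@EXAMPLE.COM && exec your-workload'
```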