[[["이해하기 쉬움","easyToUnderstand","thumb-up"],["문제가 해결됨","solvedMyProblem","thumb-up"],["기타","otherUp","thumb-up"]],[["이해하기 어려움","hardToUnderstand","thumb-down"],["잘못된 정보 또는 샘플 코드","incorrectInformationOrSampleCode","thumb-down"],["필요한 정보/샘플이 없음","missingTheInformationSamplesINeed","thumb-down"],["번역 문제","translationIssue","thumb-down"],["기타","otherDown","thumb-down"]],["최종 업데이트: 2025-08-27(UTC)"],[[["\u003cp\u003eEnabling Dataproc cluster caching improves Spark job performance by caching frequently accessed Cloud Storage data on local SSDs, reducing data retrieval time and storage costs.\u003c/p\u003e\n"],["\u003cp\u003eCluster caching applies to all Spark jobs on the cluster, whether submitted to the Dataproc service or run independently, and this applies only to Cloud Storage data.\u003c/p\u003e\n"],["\u003cp\u003eCluster caching is only compatible with clusters meeting specific criteria, such as having one master and \u003ccode\u003en\u003c/code\u003e workers, supported image versions (\u003ccode\u003e2.0.72+\u003c/code\u003e, \u003ccode\u003e2.1.20+\u003c/code\u003e, \u003ccode\u003e2.2.0+\u003c/code\u003e), NVME local SSDs, and the default VM service account.\u003c/p\u003e\n"],["\u003cp\u003eYou can enable cluster caching during Dataproc cluster creation through the Google Cloud console, gcloud CLI, or the Dataproc API, using the property \u003ccode\u003edataproc:dataproc.cluster.caching.enabled=true\u003c/code\u003e.\u003c/p\u003e\n"]]],[],null,["When you enable Dataproc cluster caching, the cluster caches\nCloud Storage data frequently accessed by your Spark jobs.\n\nBenefits\n\n- **Improved performance:** Caching can improve job performance by reducing the amount of time spent retrieving data from storage.\n- **Reduced storage costs:** Since hot data is cached on local disk, fewer API calls are made to storage to retrieve data.\n- **Spark job applicability**: When cluster caching is enabled on a cluster, it applies to all Spark jobs run on the cluster, whether submitted to the Dataproc service or run independently on the cluster.\n\nLimitations and requirements\n\n- Caching applies to Dataproc Spark jobs only.\n- Only Cloud Storage data is cached.\n- Caching only applies to clusters that meet the following requirements:\n - The cluster has one master and `n` workers ([High Availability (HA)](/dataproc/docs/concepts/configuring-clusters/high-availability) and [single node](/dataproc/docs/concepts/configuring-clusters/single-node-clusters) clusters are not supported).\n - This feature is available in Dataproc on Compute Engine [image versions](/dataproc/docs/concepts/versioning/dataproc-version-clusters#supported-dataproc-image-versions) `2.0.72+`, `2.1.20+`, and `2.2.0+`.\n - Each cluster node must have [local SSDs](/dataproc/docs/concepts/compute/dataproc-local-ssds) attached with the [NVME (Non-Volatile Memory Express)](/compute/docs/disks/local-ssd#nvme) interface (Persistent Disks (PDs) are not supported). Data is cached on NVME local SSDs only.\n - The cluster uses the [default VM service account](/dataproc/docs/concepts/configuring-clusters/service-accounts#VM_service_account) for authentication. [Custom VM service accounts](/dataproc/docs/concepts/configuring-clusters/service-accounts#create_a_cluster_with_a_custom_vm_service_account) are not supported.\n\nEnable cluster caching\n\nYou can enable cluster caching when you create a Dataproc cluster\nusing the Google Cloud console, Google Cloud CLI, or the Dataproc API. 
## Enable cluster caching

You can enable cluster caching when you create a Dataproc cluster using the Google Cloud console, Google Cloud CLI, or the Dataproc API.

### Google Cloud console

- Open the Dataproc [**Create a cluster on Compute Engine**](https://console.cloud.google.com/dataproc/clustersAdd) page in the Google Cloud console.
- The **Set up cluster** panel is selected. In the **Spark performance enhancements** section, select **Enable Google Cloud Storage caching**.
- After confirming and specifying cluster details in the cluster create panels, click **Create**.

### gcloud CLI

Run the [gcloud dataproc clusters create](/sdk/gcloud/reference/dataproc/clusters/create) command locally in a terminal window or in [Cloud Shell](https://console.cloud.google.com/?cloudshell=true) using the `dataproc:dataproc.cluster.caching.enabled=true` [cluster property](/dataproc/docs/concepts/configuring-clusters/cluster-properties#dataproc_service_properties_table).

Example:

```
gcloud dataproc clusters create CLUSTER_NAME \
    --region=REGION \
    --properties dataproc:dataproc.cluster.caching.enabled=true \
    --num-master-local-ssds=2 \
    --master-local-ssd-interface=NVME \
    --num-worker-local-ssds=2 \
    --worker-local-ssd-interface=NVME \
    other args ...
```

### REST API

Set [SoftwareConfig.properties](/dataproc/docs/reference/rest/v1/ClusterConfig#SoftwareConfig.FIELDS.properties) to include the `"dataproc:dataproc.cluster.caching.enabled": "true"` [cluster property](/dataproc/docs/concepts/configuring-clusters/cluster-properties#dataproc_service_properties_table) as part of a [clusters.create](/dataproc/docs/reference/rest/v1/projects.regions.clusters/create) request.
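For illustration, here is a minimal sketch of such a request sent with `curl`. `PROJECT_ID`, `REGION`, and `CLUSTER_NAME` are placeholders, and the local SSD settings mirror the gcloud CLI example above; adjust the worker and disk configuration to match your own cluster.

```
# Sketch of a clusters.create request that sets the caching property and
# attaches NVME local SSDs (PROJECT_ID, REGION, CLUSTER_NAME are placeholders).
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{
        "projectId": "PROJECT_ID",
        "clusterName": "CLUSTER_NAME",
        "config": {
          "softwareConfig": {
            "properties": {
              "dataproc:dataproc.cluster.caching.enabled": "true"
            }
          },
          "masterConfig": {
            "diskConfig": {
              "numLocalSsds": 2,
              "localSsdInterface": "nvme"
            }
          },
          "workerConfig": {
            "numInstances": 2,
            "diskConfig": {
              "numLocalSsds": 2,
              "localSsdInterface": "nvme"
            }
          }
        }
      }' \
  "https://dataproc.googleapis.com/v1/projects/PROJECT_ID/regions/REGION/clusters"
```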