[[["容易理解","easyToUnderstand","thumb-up"],["確實解決了我的問題","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["難以理解","hardToUnderstand","thumb-down"],["資訊或程式碼範例有誤","incorrectInformationOrSampleCode","thumb-down"],["缺少我需要的資訊/範例","missingTheInformationSamplesINeed","thumb-down"],["翻譯問題","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["上次更新時間:2025-09-04 (世界標準時間)。"],[[["\u003cp\u003eGoogle's responsibilities for Dataproc include securing the underlying infrastructure, releasing security patches for Dataproc images, providing Google Cloud integrations, restricting and logging administrative access, and recommending configuration best practices.\u003c/p\u003e\n"],["\u003cp\u003eCustomers are responsible for maintaining their workloads, including application code, custom images, data, IAM policy, and clusters, and ensuring they run on up-to-date Dataproc images.\u003c/p\u003e\n"],["\u003cp\u003eGoogle encrypts data at rest and in transit, utilizes custom-designed hardware, and implements private network cables as part of their infrastructure protection.\u003c/p\u003e\n"],["\u003cp\u003eCustomers should leverage the latest subminor image version of Dataproc images, and migrate to the most recent minor image version when possible.\u003c/p\u003e\n"],["\u003cp\u003eGoogle will request customers for environmental details to be used for troubleshooting purposes.\u003c/p\u003e\n"]]],[],null,["Running business-critical workloads on Dataproc requires multiple parties to\ncarry different responsibilities. While not an exhaustive list, this page lists\nthe responsibilities for Google and the customer.\n\nDataproc: Google responsibilities\n\n- Protecting the underlying infrastructure, including hardware, firmware, kernel,\n OS, storage, network, and more. This includes:\n\n - [encrypting data at rest by default](/security/encryption-at-rest/default-encryption)\n - providing [additional customer-managed disk encryption](/dataproc/docs/concepts/configuring-clusters/customer-managed-encryption)\n - [encrypting data in transit](https://cloud.google.com/security/encryption-in-transit)\n - using [custom-designed hardware](/docs/security/titan-hardware-chip)\n - laying [private network cables](https://cloud.google.com/about/locations#network-tab)\n - protecting data centers from physical access\n - protecting the bootloader and kernel against modification using [Shielded Nodes](/kubernetes-engine/docs/how-to/shielded-gke-nodes)\n - providing network protection with [VPC Service Controls](https://cloud.google.com/vpc-service-controls/docs/supported-products)\n - following secure software development practices\n- Releasing security patches for Dataproc images . This includes:\n\n - patches for the base operating systems included in [Dataproc images](/dataproc/docs/concepts/versioning/dataproc-version-clusters) (Ubuntu, Debian, and Rocky Linux)\n - patches and fixes available for the [open source components](/dataproc/docs/concepts/versioning/dataproc-release-2.1) included in Dataproc images Security patches may only be available for operating system versions or open source software that are included in the most recent version of Dataproc images. 
Dataproc: Customer responsibilities

- Maintaining your workloads, including your application code, custom images,
  data, IAM policy, and the clusters that you run.

- Running clusters on up-to-date Dataproc images by leveraging the latest
  [subminor image version](/dataproc/docs/concepts/versioning/dataproc-version-clusters#debian_images),
  promptly refreshing your custom images, and migrating to the most recent
  minor image version as soon as it is feasible (a sketch of pinning and
  checking a cluster's image version follows this list). Image metadata
  includes a `previous-subminor` label, which is set to `true` if the cluster
  is not using the latest subminor image version. For information on how to
  view image metadata, see
  [Important notes about versioning](/dataproc/docs/concepts/versioning/overview#important_notes_about_versioning).

- Providing Google with environmental details when requested for
  troubleshooting purposes.

- Following best practices for the configuration of Dataproc and other
  Google Cloud services, and for the configuration of the open source
  components included in Dataproc images.
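To make the image-version responsibility above concrete, the following is a minimal sketch, again using the `google-cloud-dataproc` Python client, of pinning a cluster to a minor image track at creation and reading the image version back from the cluster configuration. The project, region, cluster name, and image version shown are hypothetical placeholders.

```python
# Minimal sketch: pin a Dataproc image version at cluster creation and read
# the version back from the cluster configuration.
# Project, region, and cluster names are illustrative placeholders.
from google.cloud import dataproc_v1

project_id = "example-project"
region = "us-central1"

client = dataproc_v1.ClusterControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

cluster = {
    "project_id": project_id,
    "cluster_name": "image-version-example",
    "config": {
        # Pin the cluster to a minor image track; Dataproc resolves it to the
        # latest subminor version available at creation time.
        "software_config": {"image_version": "2.2-debian12"},
        "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-4"},
        "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-4"},
    },
}

operation = client.create_cluster(
    request={"project_id": project_id, "region": region, "cluster": cluster}
)
created = operation.result()

# Read back the image version recorded in the cluster configuration so it can
# be compared against the latest published Dataproc release.
print(created.config.software_config.image_version)
```

Specifying only the major.minor portion of an image version (for example, `2.2-debian12`) lets Dataproc select the latest subminor release available at creation time, which helps keep newly created clusters on current images.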