# Security overview

Last updated (UTC): 2025-09-04.

Security recommendations
------------------------

For workloads that require a strong security boundary or isolation, consider
the following:

- To enforce strict isolation, place security-sensitive workloads in a
  different Google Cloud project.

- To control access to specific resources, enable
  [role-based access control](/data-fusion/docs/concepts/rbac) in your
  Cloud Data Fusion instances.

- To ensure that the instance isn't publicly accessible and to reduce the
  risk of sensitive data exfiltration, enable
  [internal IP addresses](/data-fusion/docs/how-to/create-private-ip) and
  [VPC Service Controls (VPC-SC)](/data-fusion/docs/how-to/using-vpc-sc) in
  your instances.

Authentication
--------------

The Cloud Data Fusion web UI supports the same authentication mechanisms as
the Google Cloud console, with access controlled through
[Identity and Access Management](/iam/docs).

Networking controls
-------------------

You can create a
[private Cloud Data Fusion instance](/data-fusion/docs/how-to/create-private-ip),
which can be connected to your VPC network through
[VPC peering](/data-fusion/docs/how-to/create-private-ip#set-up-vpc-peering) or
[Private Service Connect](/data-fusion/docs/how-to/configure-private-service-connect#configure-psc).
Private Cloud Data Fusion instances have an internal IP address and aren't
exposed to the public internet. For additional security, use
[VPC Service Controls](/data-fusion/docs/how-to/using-vpc-sc) to establish a
security perimeter around a private Cloud Data Fusion instance.

For more information, see the
[Cloud Data Fusion networking overview](/data-fusion/docs/concepts/networking).

### Pipeline execution on pre-created internal IP Dataproc clusters

You can use a private Cloud Data Fusion instance with the
[remote Hadoop provisioner](https://cdap.atlassian.net/wiki/spaces/DOCS/pages/480313996).
The Dataproc cluster must be on the VPC network
[peered](/data-fusion/docs/how-to/create-private-ip#set_up_network_peering)
with Cloud Data Fusion. The remote Hadoop provisioner is configured with the
internal IP address of the master node of the Dataproc cluster.

Access control
--------------

- Managing access to the Cloud Data Fusion instance: RBAC-enabled instances
  support managing access at a namespace level through
  Identity and Access Management.
  RBAC-disabled instances only support managing access at an instance level.
  If you have access to an instance, you have access to all pipelines and
  metadata in that instance.

- Pipeline access to your data: Pipeline access to data is provided by
  granting access to the [service account](#service_accounts), which can be a
  custom service account that you specify.

### Firewall rules

For a pipeline execution, you control ingress and egress by setting the
appropriate firewall rules on the customer VPC on which the pipeline is being
executed.

For more information, see
[Firewall rules](/data-fusion/docs/concepts/networking#firewall-rules).

Key storage
-----------

Passwords, keys, and other data are securely stored in Cloud Data Fusion and
encrypted using keys stored in [Cloud Key Management Service](/kms/docs). At
runtime, Cloud Data Fusion calls Cloud Key Management Service to retrieve the
key used to decrypt stored secrets.

Encryption
----------

By default, data is encrypted at rest using
[Google-owned and Google-managed encryption keys](/storage/docs/encryption/default-keys),
and in transit using TLS v1.2. You can use
[customer-managed encryption keys (CMEK)](/data-fusion/docs/how-to/customer-managed-encryption-keys)
to control the encryption of data written by Cloud Data Fusion pipelines,
including Dataproc cluster metadata and Cloud Storage, BigQuery, and
Pub/Sub data sources and sinks.

Service accounts
----------------

Cloud Data Fusion pipelines execute in Dataproc clusters in the customer
project, and can be configured to run using a customer-specified (custom)
service account. A custom service account must be granted the
[Service Account User](/iam/docs/understanding-roles#service-accounts-roles)
role.

Projects
--------

Cloud Data Fusion services are created in Google-managed tenant projects
that users can't access. Cloud Data Fusion pipelines execute on
Dataproc clusters inside customer projects. Customers can access these
clusters during their lifetime.

Audit logs
----------

Cloud Data Fusion audit logs are available from [Logging](/logging/docs).

Plugins and artifacts
---------------------

Operators and Admins should be wary of installing untrusted plugins or
artifacts, as they might present a security risk.

Workforce identity federation
-----------------------------

[Workforce identity federation](/iam/docs/workforce-identity-federation) users
can perform operations in Cloud Data Fusion, such as creating, deleting,
upgrading, and listing instances. For more information about limitations, see
[Workforce identity federation: supported products and limitations](/iam/docs/federated-identity-supported-services#data-fusion).
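As a sketch of the custom service account setup described in the Service
accounts section, the Service Account User role can be granted on the custom
service account with `gcloud`. All project, service account, and member names
below are hypothetical placeholders; substitute your own values.

```shell
# Hypothetical names for illustration only.
# Grant the Service Account User role on the custom service account
# that pipeline Dataproc clusters will run as.
gcloud iam service-accounts add-iam-policy-binding \
    pipeline-runner@my-project.iam.gserviceaccount.com \
    --member="serviceAccount:service-123456789@gcp-sa-datafusion.iam.gserviceaccount.com" \
    --role="roles/iam.serviceAccountUser"
```

This is a configuration sketch, not a complete setup; the custom service
account also needs permissions on the data sources and sinks your pipelines
read and write.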