You have several options for protecting the data and resources that you transfer.
Protecting your on-premises resources
Agents access files from the environment they run in, which gives you several ways to protect access to your data:
Using a restricted user or role account to run the agent container.
Limiting the file systems that are mounted to the agent container.
Choosing NFS mount permissions in accordance with your security policies, such as disallowing write access.
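As a sketch of the points above (the UID and mount path are placeholders; the image name is the documented tsop-agent image), you could run the agent container as an unprivileged user with the NFS export mounted read-only:

```shell
# Placeholders: adjust the UID and NFS path for your environment.
AGENT_UID=2000                # unprivileged user to run the agent as
NFS_EXPORT=/mnt/nfs/export    # the NFS export the agent may read

# The command is printed rather than executed so you can review it first.
cat <<EOF
docker run --user ${AGENT_UID}:${AGENT_UID} \\
  --volume ${NFS_EXPORT}:/transfer_root:ro \\
  gcr.io/cloud-ingest/tsop-agent:latest
EOF
```

The `:ro` mount option enforces read-only access at the container boundary, so even a compromised agent process cannot modify source files.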
Protecting data in-flight
Transfer service for on-premises data encrypts your data in an HTTPS session with TLS, both for connections over the public internet and for private connections such as Cloud Interconnect. If you use Cloud Interconnect, you can add a further layer of security by using private API endpoints.
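As a sketch, one common way to reach private API endpoints over Cloud Interconnect is to resolve Google API hostnames to the Private Google Access virtual IP range; the addresses below are the documented `private.googleapis.com` range, but verify them against current Google documentation for your environment:

```shell
# Assumption: your network routes 199.36.153.8/30 over Cloud Interconnect.
PRIVATE_VIPS="199.36.153.8 199.36.153.9 199.36.153.10 199.36.153.11"

# Printed rather than applied; add these entries to /etc/hosts or,
# preferably, configure them in your internal DNS.
for ip in $PRIVATE_VIPS; do
  echo "$ip private.googleapis.com"
done
```

With this resolution in place, agent traffic to Google APIs stays on your private link instead of traversing the public internet.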
Protecting Google Cloud resources
Agents use gcloud auth credentials to connect to the Pub/Sub and Cloud Storage resources used during the transfer. Your Google Cloud resources are therefore protected by Identity and Access Management (IAM) and by the account that you provision for Transfer service for on-premises data agents. You can also use a service account, which can make permissions management easier.
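As a sketch (the project and account names here are hypothetical), you could provision a dedicated service account for the agents and authenticate with it:

```shell
# Hypothetical names: replace my-project and transfer-agent with your own.
PROJECT_ID="my-project"
SA_NAME="transfer-agent"
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Printed rather than executed so the commands can be reviewed first.
cat <<EOF
gcloud iam service-accounts create ${SA_NAME} --project=${PROJECT_ID}
gcloud iam service-accounts keys create key.json --iam-account=${SA_EMAIL}
gcloud auth activate-service-account ${SA_EMAIL} --key-file=key.json
EOF
```

A dedicated service account keeps the agents' permissions separate from any individual user's credentials, so you can grant, audit, and revoke them independently.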
Transfer service for on-premises data supports the following Storage Transfer Service predefined IAM roles:
Storage Transfer Admin — Provides all Storage Transfer Service permissions.
Storage Transfer User — Can submit and monitor jobs, but can't delete jobs or see admin settings such as agent details or bandwidth settings.
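For example (the project ID and user are hypothetical), granting the Storage Transfer User role looks like this; the predefined role IDs are roles/storagetransfer.admin and roles/storagetransfer.user:

```shell
PROJECT_ID="my-project"                       # hypothetical project
MEMBER="user:transfer-operator@example.com"   # hypothetical user
ROLE="roles/storagetransfer.user"             # Storage Transfer User

# Printed rather than executed so the binding can be reviewed first.
cat <<EOF
gcloud projects add-iam-policy-binding ${PROJECT_ID} \\
  --member=${MEMBER} --role=${ROLE}
EOF
```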
Transfer service for on-premises data doesn't support custom IAM roles or the Storage Transfer Viewer predefined role. Users assigned either of these may see an inconsistent user interface: pages that they don't have permission for display an error or appear blank. In either case, the actions they can perform remain restricted by their permissions.