How Google Cloud helps navigate your DPIA and AI privacy compliance journey
Marc Crandall
Director and Global Head of Privacy, Google Cloud
Nathaly Rey
Director, Global Regulatory Affairs, Google Cloud
At Google, we understand that new technology applications such as AI-driven innovation can introduce new questions about data privacy. We are committed to helping our customers meet their data protection obligations while using the AI offerings integrated in Google Cloud services (which include Google Workspace services).
Some customers may need to carry out Data Protection Impact Assessments (DPIAs) or similar assessments under their local laws for personal data that will be processed when they use Google Cloud services. To assist with these efforts, we’re continually improving our DPIA Resource Center with updated content and guidance.
We have also published a new paper on generative AI, privacy, and Google Cloud to aid customers and regulators seeking clarity on the use of AI technology.
Understanding DPIAs
A DPIA is a documented process undertaken by a customer as a data controller to describe, assess, and manage the data protection risks of a project. It can also be used to demonstrate compliance with data protection obligations.
A “project” could be, for example, an organization’s use of Google Workspace for Education to support teaching, learning, and collaboration, or its use of Google Cloud to securely process customer data. While Google cannot conduct DPIAs on behalf of our customers, we assist by making information and other resources available that can help you complete a DPIA for your use of Google Cloud services.
As technology and our customers' needs evolve, we continue to develop the DPIA Resource Center in support of its three main goals:
- To help you determine what a DPIA is, and whether you need a DPIA for your use of Google Cloud services;
- To provide guidance on how to approach preparing a DPIA;
- To offer resources that can assist your completion of a DPIA, including downloadable DPIA templates.
While our DPIA Resource Center is primarily designed to provide a foundational overview of DPIA requirements under the GDPR, we expect that the information we provide about Google Cloud services and our data processing can help customers in all regions carry out similar assessments under their local laws.
Customers benefit from streamlined navigation and downloadable templates that help them better structure and document their DPIA process.
Preserving privacy in Google Cloud AI offerings
Our essential commitments on customer data remain unchanged for our Google Cloud AI offerings. This means our customers can use the DPIA Resource Center and other resources to assess our Google Cloud AI offerings.
We are committed to preserving our customers' privacy and supporting their compliance journey. Our essential commitments include:
- Your data is your data. The data and content generated by a generative AI service when prompted with customer data (“generated output”) are considered customer data that Google processes only according to the customer’s instructions. We continue to maintain that customers control their data, and we process it according to the agreement(s) we have with each customer.
- Your data does not train our models. We recognize that customers want their data to remain private and not be shared with the broader Google or foundation model training corpus. We do not use the data you provide us to train our own models without your permission.
- We provide enterprise-grade privacy and security. Google Cloud’s AI offerings, such as our Vertex AI platform and foundation models, are built with enterprise-grade safety, security, and privacy from the beginning.
You can learn more about Google Cloud and Google Workspace privacy and AI commitments here.