How Gemini for Google Cloud works

The Gemini large language models (LLMs) that are used by Gemini for Google Cloud are trained on datasets of publicly available code, Google Cloud-specific material, and other relevant technical information in addition to the datasets used to train the Gemini foundation models. Models are trained so that Gemini for Google Cloud responses are as useful to Google Cloud users as possible.

Additionally, Gemini in the Google Cloud console includes basic user contextual information (project ID, product area, page title, and organization ID) when giving responses. Google Cloud CLI (gcloud) commands and other code may incorporate user context directly into the generated code snippet. Gemini doesn't persist contextual information.
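As a hypothetical illustration of how that context can surface in output (the prompt, project ID, and generated command below are invented examples, not actual Gemini output), a chat prompt such as "list my VM instances" asked from a Cloud console page might produce a command that already carries the active project:

```shell
# Hypothetical example: Gemini can fold contextual information, such as the
# project ID from your current console session, directly into the generated
# command, so you don't have to fill it in yourself.
gcloud compute instances list --project=example-project-id
```

The `--project` flag shown is a standard gcloud flag; the value is a placeholder for whatever project is active in your session.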

Gemini for Google Cloud doesn't use your prompts or generated responses for training or fine-tuning our underlying models. Any data sent to the Gemini for Google Cloud models is used strictly for serving a response to the request, and unless instructed by you, isn't stored. There are no controls available to filter or block certain information from being sent to Gemini for Google Cloud.

Gemini for Google Cloud serves requests as close to the user as possible where capacity is available. For more information on where Gemini for Google Cloud serves from, see Gemini for Google Cloud locations.

How and when Gemini for Google Cloud cites sources

Gemini for Google Cloud LLMs, like some other standalone LLM experiences, are intended to generate original content and not replicate existing content at length. We've designed our systems to limit the chances of this occurring, and we continue to improve how these systems function.

If Gemini for Google Cloud directly quotes at length from a web page, it cites that page. For answers with URLs, Gemini for Google Cloud lets users see and, in some cases, click to navigate directly to the source page.

When generating code or offering code completion, Gemini for Google Cloud provides citation information when it directly quotes at length from another source, such as existing open source code. In the case of citations to code repositories, the citation might also reference an applicable open source license.

How Gemini for Google Cloud helps protect you with generative AI indemnification

Gemini for Google Cloud is covered as a Generative AI Indemnified Service.

If you are challenged on copyright grounds after using content generated by Gemini for Google Cloud, we assume certain responsibility for the potential legal risks involved.

For full details about the indemnity, see our Service Specific Terms, or read our blog post on this issue.

Gemini for Google Cloud products

The following sections provide additional details for specific Gemini for Google Cloud products.

Gemini Code Assist

To generate better code in IDEs, Gemini Code Assist gathers contextual information from the file that you're actively editing, as well as from other open and relevant local files in your project.

When working with Gemini Code Assist in your IDE, Gemini lists your project files (the context sources) that were used as reference to generate responses to your prompts. Context sources are shown every time you use Gemini chat.

You can prevent Gemini Code Assist from suggesting code that matches cited sources by adjusting settings in your IDE (VS Code, Cloud Shell Editor, and Cloud Workstations).

Code customization lets you get code suggestions based on your organization's private codebase directly from Gemini Code Assist. To learn more about code customization, and how we provide security when accessing and storing your private code, see the Gemini Code Assist overview. To configure and use code customization, see Configure and use Gemini Code Assist code customization.

For more information about Gemini Code Assist security controls, see Security, privacy, and compliance for Gemini Code Assist on Google Cloud.