How Gemini for Google Cloud works

Gemini large language models (LLMs) are trained on a large corpus of publicly available code, Google Cloud-specific material, and Stack Overflow posts.

Additionally, Gemini in the Google Cloud console includes basic contextual information about the user (project ID, product area, page title, and organization ID) when generating responses. Generated Google Cloud CLI (gcloud) commands and other code may incorporate this user context directly into the snippet. Gemini doesn't persist this contextual information.
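As a hypothetical illustration of how that works (the project ID and command below are placeholders chosen for this sketch, not actual Gemini output), the project ID read from the console context can end up substituted directly into a generated gcloud command:

```shell
# PROJECT_ID stands in for the project ID Gemini reads from the
# console context; "example-project" is a placeholder, not a real project.
PROJECT_ID="example-project"

# A generated snippet might embed that context directly in the command:
GENERATED_CMD="gcloud compute instances list --project=${PROJECT_ID}"
echo "${GENERATED_CMD}"
```

Because the context is only read to build the response, nothing about the project is retained after the snippet is returned.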

Gemini operates as a stateless service. It doesn't store your prompts or generated responses, nor use this data to train or fine-tune the underlying models. Any data sent to the Gemini models is used strictly to serve a response to the request and is not stored, in order to mitigate IP exfiltration risks. There are no controls available to filter or block specific information from being sent to Gemini.

Gemini serves requests as close to the user as possible where capacity is available. For more information on where Gemini serves from, see Gemini locations.

How and when Gemini cites sources

Gemini LLMs, like some other standalone LLM experiences, are intended to generate original content and not replicate existing content at length. We've designed our systems to limit the chances of this occurring, and we continue to improve how these systems function.

If Gemini directly quotes at length from a web page, it cites that page. For answers with URLs, Gemini lets users see and, in some cases, click to navigate directly to the source page.

When generating code or offering code completion, Gemini provides citation information when it directly quotes at length from another source, such as existing open source code. In the case of citations to code repositories, the citation might also reference an applicable open source license.

Gemini Code Assist

To allow for better code generation in IDEs, Gemini Code Assist gathers contextual information from the file that you're actively editing in your IDE as well as from other open files in your project.

When working with Gemini Code Assist in your IDE, Gemini lists your project files (the context sources) that were used as reference to generate responses to your prompts. Context sources are shown every time you use Gemini chat.

You can prevent Gemini Code Assist from suggesting code that matches cited sources by adjusting settings in Cloud Code (VS Code, IntelliJ, Cloud Shell Editor, and Cloud Workstations).

How Gemini helps protect you with generative AI indemnification

Gemini is covered as a Generative AI Indemnified Service.

If you are challenged on copyright grounds after using content generated by Gemini, we assume responsibility for the potential legal risks involved.

For full details about the indemnity, see our Service Specific Terms, or read our blog post on this issue.