Optional. The maximum number of output tokens to generate per message.
### responseMimeType

    responseMimeType?: string;

Optional. Output response MIME type of the generated candidate text. Supported MIME types:

- `text/plain` (default): Text output.
- `application/json`: JSON response in the candidates. The model needs to be prompted to output the appropriate response type; otherwise the behavior is undefined. This is a preview feature.
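For example, the following sketch requests JSON output through the Node.js SDK. The project ID, location, and model name are placeholders, and the response-parsing path assumes the candidate structure used by recent versions of `@google-cloud/vertexai`; adapt both to your environment.

    import { VertexAI } from '@google-cloud/vertexai';

    // Placeholder project, location, and model name; substitute your own values.
    const vertexAI = new VertexAI({ project: 'my-project', location: 'us-central1' });

    const model = vertexAI.getGenerativeModel({
      model: 'gemini-1.5-flash-001',
      generationConfig: {
        // Ask for JSON in the candidates; the prompt still has to describe the shape.
        responseMimeType: 'application/json',
      },
    });

    async function listColors(): Promise<void> {
      const result = await model.generateContent(
        'List three primary colors as a JSON array of strings.'
      );
      // The candidate text should parse as JSON when the model follows the prompt.
      const text = result.response.candidates?.[0]?.content?.parts?.[0]?.text ?? '[]';
      console.log(JSON.parse(text));
    }

    listColors();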
### stopSequences

    stopSequences?: string[];

Optional. Stop sequences: strings that cause the model to stop generating when one of them appears in the output.
### temperature

    temperature?: number;

Optional. Controls the randomness of predictions. Lower values produce more deterministic output; higher values produce more varied output.
### topK

    topK?: number;

Optional. If specified, topK sampling will be used.
### topP

    topP?: number;

Optional. If specified, nucleus sampling will be used.
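Taken together, these fields are set on the `generationConfig` passed to `getGenerativeModel`. A minimal sketch follows, assuming placeholder project, location, and model values and that `GenerationConfig` is exported from the package root as in recent SDK versions:

    import { GenerationConfig, VertexAI } from '@google-cloud/vertexai';

    // Placeholder project, location, and model name; substitute your own values.
    const vertexAI = new VertexAI({ project: 'my-project', location: 'us-central1' });

    // Every field is optional; unset fields fall back to the model's defaults.
    const generationConfig: GenerationConfig = {
      candidateCount: 1,        // number of candidates to generate
      maxOutputTokens: 256,     // cap on tokens generated per message
      temperature: 0.2,         // lower values give more deterministic output
      topP: 0.95,               // nucleus sampling threshold
      topK: 40,                 // sample from the 40 most likely tokens
      stopSequences: ['END'],   // stop generating when this string is produced
    };

    const model = vertexAI.getGenerativeModel({
      model: 'gemini-1.5-flash-001',
      generationConfig,
    });

    async function run(): Promise<void> {
      const result = await model.generateContent('Write a one-sentence product description.');
      console.log(result.response.candidates?.[0]?.content?.parts?.[0]?.text);
    }

    run();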
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-09-04 UTC."],[],[],null,["# Interface GenerationConfig (1.4.1)\n\nConfiguration options for model generation and outputs.\n\nPackage\n-------\n\n[@google-cloud/vertexai](../overview.html)\n\nProperties\n----------\n\n### candidateCount\n\n candidateCount?: number;\n\nOptional. Number of candidates to generate.\n\n### maxOutputTokens\n\n maxOutputTokens?: number;\n\nOptional. The maximum number of output tokens to generate per message.\n\n### responseMimeType\n\n responseMimeType?: string;\n\nOptional. Output response mimetype of the generated candidate text. Supported mimetype: - `text/plain`: (default) Text output. - `application/json`: JSON response in the candidates. The model needs to be prompted to output the appropriate response type, otherwise the behavior is undefined. This is a preview feature.\n\n### stopSequences\n\n stopSequences?: string[];\n\nOptional. Stop sequences.\n\n### temperature\n\n temperature?: number;\n\nOptional. Controls the randomness of predictions.\n\n### topK\n\n topK?: number;\n\nOptional. If specified, topK sampling will be used.\n\n### topP\n\n topP?: number;\n\nOptional. If specified, nucleus sampling will be used."]]