GenerationConfig

Configuration options for model generation and outputs.

Fields
stopSequences[] string

Optional. Character sequences that stop output generation. If specified, generation stops at the first appearance of a stop sequence, and the stop sequence is not included in the response.

responseMimeType string

Optional. Output response MIME type of the generated candidate text. Supported MIME types:
- text/plain (default): Text output.
- application/json: JSON response in the candidates.
The model needs to be prompted to output the appropriate response type; otherwise the behavior is undefined. This is a preview feature. See the sketch after the field list for an example.

temperature number

Optional. Controls the randomness of predictions. Lower values make output more deterministic; higher values make it more varied.

topP number

Optional. If specified, nucleus sampling is used: at each step the model samples from the smallest set of tokens whose cumulative probability is at least topP.

topK number

Optional. If specified, top-k sampling is used: at each step the model samples from the topK most probable tokens.

candidateCount integer

Optional. Number of candidates to generate.

maxOutputTokens integer

Optional. The maximum number of output tokens to generate per message.

responseLogprobs boolean

Optional. If true, export the log probability results in the response.

logprobs integer

Optional. Number of top candidate tokens to return log probabilities for at each decoding step. Only takes effect when responseLogprobs is true.

presencePenalty number

Optional. Positive values penalize tokens that have already appeared in the generated text, encouraging the model to introduce new content.

frequencyPenalty number

Optional. Positive values penalize tokens in proportion to how often they have already appeared in the generated text, reducing repetition.

seed integer

Optional. Random seed used in decoding. If not set, a randomly generated seed is used.

responseSchema object (Schema)

Optional. Output schema of the generated candidate text. The Schema object allows the definition of input and output data types; these can be objects, but also primitives and arrays, and represent a select subset of an OpenAPI 3.0 schema object. If set, a compatible responseMimeType must also be set. Compatible MIME types: application/json (schema for a JSON response).
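
When requesting structured output, responseMimeType and responseSchema are set together. A minimal sketch in Python, assuming the field names above are passed through as a plain dictionary and using a hypothetical recipe schema (substitute any Schema-compatible structure of your own):

generation_config = {
    "temperature": 0.2,
    "maxOutputTokens": 1024,
    # responseSchema requires a JSON-compatible responseMimeType.
    "responseMimeType": "application/json",
    "responseSchema": {
        "type": "ARRAY",
        "items": {
            "type": "OBJECT",
            "properties": {
                # Hypothetical field, used only for illustration.
                "recipeName": {"type": "STRING"}
            },
            "required": ["recipeName"]
        }
    }
}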

JSON representation
{
  "stopSequences": [
    string
  ],
  "responseMimeType": string,
  "temperature": number,
  "topP": number,
  "topK": number,
  "candidateCount": integer,
  "maxOutputTokens": integer,
  "responseLogprobs": boolean,
  "logprobs": integer,
  "presencePenalty": number,
  "frequencyPenalty": number,
  "seed": integer,
  "responseSchema": {
    object (Schema)
  }
}
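
The sketch below shows one way this object travels in a request, assuming the public generativelanguage.googleapis.com generateContent REST endpoint, a gemini-1.5-flash model, and an API key exported as GEMINI_API_KEY; these names are assumptions, so adjust them for your deployment.

import os

import requests

MODEL = "gemini-1.5-flash"  # assumed model name; use the model you actually target
URL = ("https://generativelanguage.googleapis.com/v1beta/"
       f"models/{MODEL}:generateContent")

body = {
    "contents": [{"parts": [{"text": "List three primary colors."}]}],
    # GenerationConfig fields exactly as documented above.
    "generationConfig": {
        "stopSequences": ["\n\n"],
        "temperature": 0.7,
        "topP": 0.95,
        "topK": 40,
        "candidateCount": 1,
        "maxOutputTokens": 256,
        "seed": 1234,
    },
}

resp = requests.post(URL, params={"key": os.environ["GEMINI_API_KEY"]},
                     json=body, timeout=30)
resp.raise_for_status()
# The generated text lives under candidates[0].content.parts[0].text.
print(resp.json()["candidates"][0]["content"]["parts"][0]["text"])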