Interface LlmModelSettings.ParametersOrBuilder (0.89.0)

public static interface LlmModelSettings.ParametersOrBuilder extends MessageOrBuilder

Implements

MessageOrBuilder

Methods

getInputTokenLimit()

public abstract LlmModelSettings.Parameters.InputTokenLimit getInputTokenLimit()

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Returns
Type: LlmModelSettings.Parameters.InputTokenLimit
Description: The inputTokenLimit.

getInputTokenLimitValue()

public abstract int getInputTokenLimitValue()

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Returns
Type: int
Description: The enum numeric value on the wire for inputTokenLimit.
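The paired accessors above follow the standard protobuf-java pattern: `getInputTokenLimit()` maps the wire number to an enum constant (unknown numbers become the `UNRECOGNIZED` sentinel), while `getInputTokenLimitValue()` returns the raw number unchanged, which is useful when a newer server sends a value this client version does not know. A minimal self-contained sketch of that pattern, using stand-in enum constants rather than the generated `InputTokenLimit` values:

```java
// Sketch of the protobuf enum-accessor pattern behind getInputTokenLimit()
// and getInputTokenLimitValue(). The enum constants here are stand-ins, not
// the generated InputTokenLimit values.
public class EnumAccessorSketch {

    // Generated proto enums carry their wire number and map unknown numbers
    // to an UNRECOGNIZED sentinel rather than throwing.
    enum InputTokenLimit {
        INPUT_TOKEN_LIMIT_UNSPECIFIED(0),
        INPUT_TOKEN_LIMIT_SMALL(1),      // hypothetical constant for the sketch
        UNRECOGNIZED(-1);

        private final int number;
        InputTokenLimit(int number) { this.number = number; }

        static InputTokenLimit forNumber(int number) {
            for (InputTokenLimit v : values()) {
                if (v != UNRECOGNIZED && v.number == number) return v;
            }
            return null; // unknown wire value
        }
    }

    private final int inputTokenLimitValue; // raw value as stored on the wire

    EnumAccessorSketch(int wireValue) { this.inputTokenLimitValue = wireValue; }

    // Mirrors getInputTokenLimitValue(): returns the numeric wire value as-is.
    int getInputTokenLimitValue() { return inputTokenLimitValue; }

    // Mirrors getInputTokenLimit(): unknown numbers become UNRECOGNIZED.
    InputTokenLimit getInputTokenLimit() {
        InputTokenLimit v = InputTokenLimit.forNumber(inputTokenLimitValue);
        return v == null ? InputTokenLimit.UNRECOGNIZED : v;
    }
}
```

Code written against `getInputTokenLimit()` should therefore handle `UNRECOGNIZED` explicitly, falling back to `getInputTokenLimitValue()` if the raw number needs to be preserved.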

getOutputTokenLimit()

public abstract LlmModelSettings.Parameters.OutputTokenLimit getOutputTokenLimit()

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Returns
Type: LlmModelSettings.Parameters.OutputTokenLimit
Description: The outputTokenLimit.

getOutputTokenLimitValue()

public abstract int getOutputTokenLimitValue()

The output token limit. This setting is currently only supported by playbooks. Only one of output_token_limit and max_output_tokens is allowed to be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Returns
Type: int
Description: The enum numeric value on the wire for outputTokenLimit.

getTemperature()

public abstract float getTemperature()

The temperature used for sampling during response generation. Temperature controls the degree of randomness in token selection: lower values mean less randomness, higher values mean more. Valid range: [0.0, 1.0].

optional float temperature = 1;

Returns
Type: float
Description: The temperature.

hasInputTokenLimit()

public abstract boolean hasInputTokenLimit()

The input token limit. This setting is currently only supported by playbooks.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.InputTokenLimit input_token_limit = 2;

Returns
Type: boolean
Description: Whether the inputTokenLimit field is set.

hasOutputTokenLimit()

public abstract boolean hasOutputTokenLimit()

The output token limit. This setting is currently only supported by playbooks. At most one of output_token_limit and max_output_tokens may be set.

optional .google.cloud.dialogflow.cx.v3beta1.LlmModelSettings.Parameters.OutputTokenLimit output_token_limit = 3;

Returns
Type: boolean
Description: Whether the outputTokenLimit field is set.

hasTemperature()

public abstract boolean hasTemperature()

The temperature used for sampling during response generation. Temperature controls the degree of randomness in token selection: lower values mean less randomness, higher values mean more. Valid range: [0.0, 1.0].

optional float temperature = 1;

Returns
Type: boolean
Description: Whether the temperature field is set.
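All three fields are declared `optional`, so they carry explicit presence: `get*()` returns the proto default (e.g. `0.0f` for temperature) when the field was never set, and the `has*()` methods distinguish "unset" from "set to the default". A minimal self-contained sketch of that read pattern, using a stand-in class rather than the generated `ParametersOrBuilder`:

```java
// Sketch of the read pattern for optional proto3 fields: consult has*()
// before get*(), because get*() silently returns the default when unset.
// ParametersStandIn is a stand-in, not the generated ParametersOrBuilder.
public class OptionalFieldSketch {

    static final class ParametersStandIn {
        private final Float temperature; // null models "field not set"
        ParametersStandIn(Float temperature) { this.temperature = temperature; }

        boolean hasTemperature() { return temperature != null; }
        // Like the generated getter, returns 0.0f when the field is unset.
        float getTemperature() { return temperature != null ? temperature : 0.0f; }
    }

    // Fall back to a caller-chosen default when the field is absent.
    static float temperatureOrDefault(ParametersStandIn p, float fallback) {
        return p.hasTemperature() ? p.getTemperature() : fallback;
    }
}
```

The same guard applies to `hasInputTokenLimit()`/`getInputTokenLimit()` and `hasOutputTokenLimit()`/`getOutputTokenLimit()`.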