Module llm (1.3.0)

LLM models.

Classes

GeminiTextGenerator

GeminiTextGenerator(
    *,
    session: typing.Optional[bigframes.session.Session] = None,
    connection_name: typing.Optional[str] = None
)

Gemini text generator LLM model.

Parameters

session : bigframes.Session or None
    BQ session used to create the model. If None, the global default session is used.

connection_name : str or None
    Name of the connection used to reach the remote service, a str of the format <PROJECT_NUMBER/PROJECT_ID>.
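As an illustrative sketch (not part of the reference): the helper below wraps the standard BigQuery DataFrames predict flow. The column name "prompt" is an assumption for illustration, and calling the function requires a configured Google Cloud project, so imports are deferred into the function body.

```python
def generate_text(prompts):
    """Generate text for each prompt with Gemini via BigQuery DataFrames.

    Imports are deferred so this sketch can be defined without an active
    Google Cloud session; invoking it requires a configured project and
    BigQuery connection.
    """
    import bigframes.pandas as bpd
    from bigframes.ml.llm import GeminiTextGenerator

    # session=None falls back to the global default session;
    # connection_name=None uses the default connection, if configured.
    model = GeminiTextGenerator(session=None, connection_name=None)
    df = bpd.DataFrame({"prompt": prompts})
    return model.predict(df)
```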

PaLM2TextEmbeddingGenerator

PaLM2TextEmbeddingGenerator(
    *,
    model_name: typing.Literal[
        "textembedding-gecko", "textembedding-gecko-multilingual"
    ] = "textembedding-gecko",
    version: typing.Optional[str] = None,
    session: typing.Optional[bigframes.session.Session] = None,
    connection_name: typing.Optional[str] = None
)

PaLM2 text embedding generator LLM model.

Parameters

model_name : str, default "textembedding-gecko"
    The model for text embedding. "textembedding-gecko" returns model embeddings for text inputs; "textembedding-gecko-multilingual" returns model embeddings for text inputs and supports over 100 languages.

version : str or None
    Model version. Accepted values are "001", "002", "003", "latest", etc. The default version is used if unset. See https://cloud.google.com/vertex-ai/docs/generative-ai/learn/model-versioning for details.

session : bigframes.Session or None
    BQ session used to create the model. If None, the global default session is used.

connection_name : str or None
    Name of the connection used to reach the remote service, a str of the format <PROJECT_NUMBER/PROJECT_ID>.
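A minimal sketch of the embedding flow, assuming the same predict pattern as the generators above. The column name "content" is an assumption for illustration; imports are deferred because execution requires a Google Cloud session.

```python
def embed_text(texts, multilingual=False):
    """Compute text embeddings with PaLM2 via BigQuery DataFrames.

    Set multilingual=True to use "textembedding-gecko-multilingual",
    which supports over 100 languages.
    """
    import bigframes.pandas as bpd
    from bigframes.ml.llm import PaLM2TextEmbeddingGenerator

    model = PaLM2TextEmbeddingGenerator(
        model_name=(
            "textembedding-gecko-multilingual"
            if multilingual
            else "textembedding-gecko"
        ),
        version=None,  # None uses the service's default model version
    )
    df = bpd.DataFrame({"content": texts})
    return model.predict(df)
```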

PaLM2TextGenerator

PaLM2TextGenerator(
    *,
    model_name: typing.Literal["text-bison", "text-bison-32k"] = "text-bison",
    session: typing.Optional[bigframes.session.Session] = None,
    connection_name: typing.Optional[str] = None,
    max_iterations: int = 300
)

PaLM2 text generator LLM model.

Parameters

model_name : str, default "text-bison"
    The model for natural language tasks. "text-bison" is fine-tuned to follow natural language instructions and is suitable for a variety of language tasks; "text-bison-32k" supports up to 32k tokens per request.

session : bigframes.Session or None
    BQ session used to create the model. If None, the global default session is used.

connection_name : str or None
    Name of the connection used to reach the remote service, a str of the format <PROJECT_NUMBER/PROJECT_ID>.

max_iterations : int, default 300
    The number of steps to run when performing supervised tuning.
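A usage sketch for the PaLM2 generator. The model choice below mirrors the parameter description above; note that max_iterations only takes effect during supervised tuning, not during prediction. The "prompt" column name is an assumption for illustration, and imports are deferred because execution requires a configured Google Cloud project.

```python
def palm2_generate(prompts, long_context=False):
    """Generate text with PaLM2 via BigQuery DataFrames.

    Set long_context=True to select "text-bison-32k", which supports
    up to 32k tokens per request.
    """
    import bigframes.pandas as bpd
    from bigframes.ml.llm import PaLM2TextGenerator

    model = PaLM2TextGenerator(
        model_name="text-bison-32k" if long_context else "text-bison",
        max_iterations=300,  # used only when performing supervised tuning
    )
    df = bpd.DataFrame({"prompt": prompts})
    return model.predict(df)
```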