Module language_models (1.26.1)

Classes for working with language models.

Classes

ChatModel

ChatModel(model_id: str, endpoint_name: Optional[str] = None)

ChatModel represents a language model that is capable of chat.

Examples::

chat_model = ChatModel.from_pretrained("chat-bison@001")

chat = chat_model.start_chat(
    context="My name is Ned. You are my personal assistant. My favorite movies are Lord of the Rings and Hobbit.",
    examples=[
        InputOutputTextPair(
            input_text="Who do you work for?",
            output_text="I work for Ned.",
        ),
        InputOutputTextPair(
            input_text="What do I like?",
            output_text="Ned likes watching movies.",
        ),
    ],
    temperature=0.3,
)

chat.send_message("Do you know any cool events this weekend?")

ChatSession

ChatSession(
    model: vertexai.language_models._language_models.ChatModel,
    context: Optional[str] = None,
    examples: Optional[
        List[vertexai.language_models._language_models.InputOutputTextPair]
    ] = None,
    max_output_tokens: int = 128,
    temperature: float = 0.0,
    top_k: int = 40,
    top_p: float = 0.95,
    message_history: Optional[
        List[vertexai.language_models._language_models.ChatMessage]
    ] = None,
)

ChatSession represents a chat session with a language model.

Within a chat session, the model keeps context and remembers the previous conversation.
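Because `message_history` accepts prior turns, a session can be resumed by replaying an earlier conversation. A sketch, assuming the `chat-bison@001` model shown above and the `ChatMessage` class from this module (the history contents are illustrative); running it requires an initialized Vertex AI environment:

```python
from vertexai.language_models import ChatModel, ChatMessage

chat_model = ChatModel.from_pretrained("chat-bison@001")

# Resume a conversation by replaying earlier turns as message history.
chat = chat_model.start_chat(
    context="You are my personal assistant.",
    message_history=[
        ChatMessage(content="What is the tallest mountain?", author="user"),
        ChatMessage(content="Mount Everest.", author="bot"),
    ],
    temperature=0.3,
)

# The model sees the replayed history, so "it" resolves against it.
response = chat.send_message("How tall is it?")
print(response.text)
```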

CodeChatModel

CodeChatModel(model_id: str, endpoint_name: Optional[str] = None)

CodeChatModel represents a language model that is capable of multi-turn chat about code.

.. rubric:: Examples

code_chat_model = CodeChatModel.from_pretrained("codechat-bison@001")

code_chat = code_chat_model.start_chat(
    max_output_tokens=128,
    temperature=0.2,
)

code_chat.send_message("Please help write a function to calculate the min of two numbers")

CodeChatSession

CodeChatSession(
    model: vertexai.language_models._language_models.CodeChatModel,
    max_output_tokens: int = 128,
    temperature: float = 0.5,
)

CodeChatSession represents a chat session with a code chat language model.

Within a code chat session, the model keeps context and remembers the previous conversation.
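Because the session keeps context, follow-up messages can refer back to earlier turns. A sketch, assuming the `codechat-bison@001` model shown above; running it requires an initialized Vertex AI environment:

```python
from vertexai.language_models import CodeChatModel

code_chat_model = CodeChatModel.from_pretrained("codechat-bison@001")
code_chat = code_chat_model.start_chat(
    max_output_tokens=128,
    temperature=0.2,
)

# First turn.
print(code_chat.send_message(
    "Please write a function to calculate the min of two numbers."
).text)

# Follow-up turn: "it" resolves against the previous answer,
# because the session retains the conversation so far.
print(code_chat.send_message(
    "Now add type hints and a docstring to it."
).text)
```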

CodeGenerationModel

CodeGenerationModel(model_id: str, endpoint_name: Optional[str] = None)

A language model that generates code.

.. rubric:: Examples

Getting answers:

generation_model = CodeGenerationModel.from_pretrained("code-bison@001")
print(generation_model.predict(
    prefix="Write a function that checks if a year is a leap year.",
))

Completing code:

completion_model = CodeGenerationModel.from_pretrained("code-gecko@001")
print(completion_model.predict(
    prefix="def reverse_string(s):",
))

InputOutputTextPair

InputOutputTextPair(input_text: str, output_text: str)

InputOutputTextPair represents a pair of input and output texts.

TextEmbedding

TextEmbedding(values: List[float], _prediction_response: Optional[Any] = None)

Contains text embedding vector.
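The `values` field is a plain list of floats, so standard vector math applies directly. A minimal sketch computing cosine similarity between two embedding vectors; the short vectors here are illustrative stand-ins, not real model output (textembedding-gecko embeddings are much longer):

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of vector magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Illustrative stand-in vectors; in practice, pass embedding.values.
v1 = [0.1, 0.2, 0.3]
v2 = [0.1, 0.2, 0.3]
print(cosine_similarity(v1, v2))  # identical vectors give similarity of ~1.0
```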

TextGenerationResponse

TextGenerationResponse(text: str, _prediction_response: Any, is_blocked: bool = False, safety_attributes: Dict[str, float] = <factory>)

TextGenerationResponse represents a response of a language model.

.. attribute:: text

    The generated text.

.. attribute:: is_blocked

    Whether the response was blocked by safety filters.

.. attribute:: safety_attributes

    Scores for safety attributes. Learn more about the safety attributes here:
    https://cloud.google.com/vertex-ai/docs/generative-ai/learn/responsible-ai#safety_attribute_descriptions
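A typical pattern is to check `is_blocked` first and then inspect the per-attribute scores. A sketch of such a check, using a plain dict and bool to stand in for a response's `safety_attributes` and `is_blocked` fields (the attribute names, scores, and the threshold below are illustrative, not real model output or a documented cutoff):

```python
# Stand-ins for TextGenerationResponse.safety_attributes / .is_blocked;
# the category names and scores here are illustrative.
safety_attributes = {"Violent": 0.1, "Derogatory": 0.7}
is_blocked = False

THRESHOLD = 0.5  # hypothetical application-level threshold

def flagged_categories(scores, threshold):
    # Return attribute names whose score meets or exceeds the threshold.
    return [name for name, score in scores.items() if score >= threshold]

if is_blocked:
    print("Response was blocked by the service.")
else:
    print("Flagged:", flagged_categories(safety_attributes, THRESHOLD))
```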