Classes for working with language models.
Classes
ChatModel
ChatModel(model_id: str, endpoint_name: Optional[str] = None)
ChatModel represents a language model that is capable of chat.
.. rubric:: Examples

    chat_model = ChatModel.from_pretrained("chat-bison@001")

    chat = chat_model.start_chat(
        context="My name is Ned. You are my personal assistant. My favorite movies are Lord of the Rings and Hobbit.",
        examples=[
            InputOutputTextPair(
                input_text="Who do you work for?",
                output_text="I work for Ned.",
            ),
            InputOutputTextPair(
                input_text="What do I like?",
                output_text="Ned likes watching movies.",
            ),
        ],
        temperature=0.3,
    )

    chat.send_message("Do you know any cool events this weekend?")
ChatSession
ChatSession(
model: vertexai.language_models._language_models.ChatModel,
context: Optional[str] = None,
examples: Optional[
List[vertexai.language_models._language_models.InputOutputTextPair]
] = None,
max_output_tokens: int = 128,
temperature: float = 0.0,
top_k: int = 40,
top_p: float = 0.95,
message_history: Optional[
List[vertexai.language_models._language_models.ChatMessage]
] = None,
)
ChatSession represents a chat session with a language model.
Within a chat session, the model keeps context and remembers the previous conversation.
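A ChatSession is typically created with ChatModel.start_chat rather than constructed directly. The following is a minimal sketch of a multi-turn exchange; the context and prompts are illustrative:

    from vertexai.language_models import ChatModel

    chat_model = ChatModel.from_pretrained("chat-bison@001")
    chat = chat_model.start_chat(
        context="You are a helpful travel assistant.",
        temperature=0.3,
    )

    # The session accumulates message history, so the second request can
    # refer back to the first answer without restating it.
    first = chat.send_message("Suggest a weekend trip near Seattle.")
    print(first.text)

    follow_up = chat.send_message("How long is the drive to your first suggestion?")
    print(follow_up.text)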
CodeChatModel
CodeChatModel(model_id: str, endpoint_name: Optional[str] = None)
CodeChatModel represents a model that is capable of completing code.
.. rubric:: Examples
    code_chat_model = CodeChatModel.from_pretrained("codechat-bison@001")

    code_chat = code_chat_model.start_chat(
        max_output_tokens=128,
        temperature=0.2,
    )

    code_chat.send_message("Please help write a function to calculate the min of two numbers")
CodeChatSession
CodeChatSession(
model: vertexai.language_models._language_models.CodeChatModel,
max_output_tokens: int = 128,
temperature: float = 0.5,
)
CodeChatSession represents a chat session with a code chat language model.
Within a code chat session, the model keeps context and remembers the previous conversation.
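As with ChatSession, a CodeChatSession is normally created via CodeChatModel.start_chat. A minimal sketch of a follow-up turn within one session; the prompts are illustrative:

    from vertexai.language_models import CodeChatModel

    code_chat_model = CodeChatModel.from_pretrained("codechat-bison@001")
    code_chat = code_chat_model.start_chat(max_output_tokens=256, temperature=0.2)

    # Because the session retains earlier turns, the second message can
    # refer to "that function" produced in the first response.
    response = code_chat.send_message(
        "Write a Python function that returns the minimum of two numbers."
    )
    print(response.text)

    revised = code_chat.send_message("Now add type hints and a docstring to that function.")
    print(revised.text)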
CodeGenerationModel
CodeGenerationModel(model_id: str, endpoint_name: Optional[str] = None)
A language model that generates code.
.. rubric:: Examples
Getting answers:
    generation_model = CodeGenerationModel.from_pretrained("code-bison@001")
    print(generation_model.predict(
        prefix="Write a function that checks if a year is a leap year.",
    ))

    completion_model = CodeGenerationModel.from_pretrained("code-gecko@001")
    print(completion_model.predict(
        prefix="def reverse_string(s):",
    ))
InputOutputTextPair
InputOutputTextPair(input_text: str, output_text: str)
InputOutputTextPair represents a pair of input and output texts.
TextEmbedding
TextEmbedding(values: List[float], _prediction_response: Optional[Any] = None)
Contains text embedding vector.
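TextEmbedding values are usually returned by an embedding model rather than constructed by hand. The sketch below assumes the TextEmbeddingModel class from this module and the textembedding-gecko@001 model, neither of which is documented in this section:

    from vertexai.language_models import TextEmbeddingModel

    embedding_model = TextEmbeddingModel.from_pretrained("textembedding-gecko@001")
    embeddings = embedding_model.get_embeddings(["What is life?"])

    for embedding in embeddings:
        # Each TextEmbedding carries the embedding vector in `values`.
        print(len(embedding.values))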
TextGenerationResponse
TextGenerationResponse(text: str, _prediction_response: Any, is_blocked: bool = False, safety_attributes: Dict[str, float] = <factory>)
TextGenerationResponse represents a response of a language model.

.. attribute:: text

    The generated text.

.. attribute:: safety_attributes

    Scores for safety attributes. Learn more about the safety attributes here: https://cloud.google.com/vertex-ai/docs/generative-ai/learn/responsible-ai#safety_attribute_descriptions
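Responses of this type are returned by the prediction and chat methods in this module. A minimal sketch of inspecting one, assuming the TextGenerationModel class and the text-bison@001 model (not documented in this section):

    from vertexai.language_models import TextGenerationModel

    model = TextGenerationModel.from_pretrained("text-bison@001")
    response = model.predict(
        "Give me ten interview questions for the role of program manager."
    )

    print(response.text)  # the generated text
    if response.is_blocked:
        # When the response was filtered, safety_attributes holds the scores.
        print(response.safety_attributes)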