Generate text responses with the Gemini API using external function calls in a chat scenario
Generate text responses with the Gemini API using external function calls. This example demonstrates a chat scenario with two functions and two sequential prompts.
Explore further
For documentation that includes this code sample, see the following:
Code sample
Python
Before trying this sample, follow the Python setup instructions in the Vertex AI quickstart using client libraries.
For more information, see the Vertex AI Python API reference documentation.
To authenticate to Vertex AI, set up Application Default Credentials.
For more information, see Set up authentication for a local development environment.
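If Application Default Credentials (ADC) are already configured in your environment (for example, via the gcloud CLI), a quick way to confirm that the sample will be able to authenticate is to load them with the google-auth library. This is a minimal sketch, separate from the sample below; the project ID it prints is simply whatever ADC resolves in your environment:

# Minimal check that Application Default Credentials are available.
# Assumes ADC has been configured, e.g. with `gcloud auth application-default login`.
import google.auth

credentials, project_id = google.auth.default()
print(f"Authenticated; ADC resolved project: {project_id}")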
import vertexai
from vertexai.generative_models import (
    FunctionDeclaration,
    GenerativeModel,
    Part,
    Tool,
)


def generate_function_call_chat(project_id: str, location: str) -> tuple:
    prompts = []
    summaries = []

    # Initialize Vertex AI
    vertexai.init(project=project_id, location=location)

    # Specify a function declaration and parameters for an API request
    get_product_info_func = FunctionDeclaration(
        name="get_product_sku",
        description="Get the SKU for a product",
        # Function parameters are specified in OpenAPI JSON schema format
        parameters={
            "type": "object",
            "properties": {
                "product_name": {"type": "string", "description": "Product name"}
            },
        },
    )

    # Specify another function declaration and parameters for an API request
    get_store_location_func = FunctionDeclaration(
        name="get_store_location",
        description="Get the location of the closest store",
        # Function parameters are specified in OpenAPI JSON schema format
        parameters={
            "type": "object",
            "properties": {"location": {"type": "string", "description": "Location"}},
        },
    )

    # Define a tool that includes the above functions
    retail_tool = Tool(
        function_declarations=[
            get_product_info_func,
            get_store_location_func,
        ],
    )

    # Initialize Gemini model
    model = GenerativeModel(
        "gemini-1.0-pro", generation_config={"temperature": 0}, tools=[retail_tool]
    )

    # Start a chat session
    chat = model.start_chat()

    # Send a prompt for the first conversation turn that should invoke the get_product_sku function
    prompt = "Do you have the Pixel 8 Pro in stock?"
    response = chat.send_message(prompt)
    prompts.append(prompt)

    # Check the function name that the model responded with, and make an API call to an external system
    if response.candidates[0].content.parts[0].function_call.name == "get_product_sku":
        # Extract the arguments to use in your API call
        product_name = response.candidates[0].content.parts[0].function_call.args[
            "product_name"
        ]

        # Here you can use your preferred method to make an API request to retrieve the product SKU, as in:
        # api_response = requests.post(product_api_url, data={"product_name": product_name})

        # In this example, we'll use synthetic data to simulate a response payload from an external API
        api_response = {"sku": "GA04834-US", "in_stock": "yes"}

        # Return the API response to Gemini so it can generate a model response or request another function call
        response = chat.send_message(
            Part.from_function_response(
                name="get_product_sku",
                response={
                    "content": api_response,
                },
            ),
        )

        # Extract the text from the summary response
        summary = response.candidates[0].content.parts[0].text
        summaries.append(summary)

    # Send a prompt for the second conversation turn that should invoke the get_store_location function
    prompt = "Is there a store in Mountain View, CA that I can visit to try it out?"
    response = chat.send_message(prompt)
    prompts.append(prompt)

    # Check the function name that the model responded with, and make an API call to an external system
    if (
        response.candidates[0].content.parts[0].function_call.name
        == "get_store_location"
    ):
        # Extract the arguments to use in your API call
        location = response.candidates[0].content.parts[0].function_call.args[
            "location"
        ]

        # Here you can use your preferred method to make an API request to retrieve the store location closest to the user, as in:
        # api_response = requests.post(store_api_url, data={"location": location})

        # In this example, we'll use synthetic data to simulate a response payload from an external API
        api_response = {"store": "2000 N Shoreline Blvd, Mountain View, CA 94043, US"}

        # Return the API response to Gemini so it can generate a model response or request another function call
        response = chat.send_message(
            Part.from_function_response(
                name="get_store_location",
                response={
                    "content": api_response,
                },
            ),
        )

        # Extract the text from the summary response
        summary = response.candidates[0].content.parts[0].text
        summaries.append(summary)

    return prompts, summaries
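A call to the function might look like the following sketch. The project ID and region shown here are placeholders for illustration; substitute your own Google Cloud project and a supported Vertex AI region:

# Example invocation (hypothetical project ID and region shown for illustration).
prompts, summaries = generate_function_call_chat(
    project_id="your-project-id", location="us-central1"
)
for prompt, summary in zip(prompts, summaries):
    print(f"Prompt: {prompt}")
    print(f"Model summary: {summary}\n")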