As of April 29, 2025, the Gemini 1.5 Pro and Gemini 1.5 Flash models are not available in projects that have not used them before, including new projects. For details, see
Model versions and lifecycle.
Generate text responses using Gemini API with external function calls in a chat scenario
Generate text responses using Gemini API with external function calls. This example demonstrates a chat scenario with two functions and two sequential prompts.
Code sample
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
### Node.js

Before trying this sample, follow the Node.js setup instructions in the [Vertex AI quickstart using client libraries](/vertex-ai/docs/start/client-libraries).

For more information, see the [Vertex AI Node.js API reference documentation](/nodejs/docs/reference/aiplatform/latest).

To authenticate to Vertex AI, set up Application Default Credentials. For more information, see [Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

```javascript
const {
  VertexAI,
  FunctionDeclarationSchemaType,
} = require('@google-cloud/vertexai');

const functionDeclarations = [
  {
    function_declarations: [
      {
        name: 'get_current_weather',
        description: 'get weather in a given location',
        parameters: {
          type: FunctionDeclarationSchemaType.OBJECT,
          properties: {
            location: {type: FunctionDeclarationSchemaType.STRING},
            unit: {
              type: FunctionDeclarationSchemaType.STRING,
              enum: ['celsius', 'fahrenheit'],
            },
          },
          required: ['location'],
        },
      },
    ],
  },
];

const functionResponseParts = [
  {
    functionResponse: {
      name: 'get_current_weather',
      response: {name: 'get_current_weather', content: {weather: 'super nice'}},
    },
  },
];

/**
 * TODO(developer): Update these variables before running the sample.
 */
async function functionCallingStreamChat(
  projectId = 'PROJECT_ID',
  location = 'us-central1',
  model = 'gemini-2.0-flash-001'
) {
  // Initialize Vertex with your Cloud project and location
  const vertexAI = new VertexAI({project: projectId, location: location});

  // Instantiate the model
  const generativeModel = vertexAI.getGenerativeModel({
    model: model,
  });

  // Create a chat session and pass your function declarations
  const chat = generativeModel.startChat({
    tools: functionDeclarations,
  });

  const chatInput1 = 'What is the weather in Boston?';

  // This should include a functionCall response from the model
  const result1 = await chat.sendMessageStream(chatInput1);
  for await (const item of result1.stream) {
    console.log(item.candidates[0]);
  }
  await result1.response;

  // Send a follow up message with a FunctionResponse
  const result2 = await chat.sendMessageStream(functionResponseParts);
  for await (const item of result2.stream) {
    console.log(item.candidates[0]);
  }

  // This should include a text response from the model using the response content
  // provided above
  const response2 = await result2.response;
  console.log(response2.candidates[0].content.parts[0].text);
}
```

What's next
-----------

To search and filter code samples for other Google Cloud products, see the [Google Cloud sample browser](/docs/samples?product=generativeaionvertexai).
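The sample above hardcodes `functionResponseParts` for brevity. In a real application, you would read the `functionCall` the model returns in the first turn, invoke a local implementation, and build the function response from its output. The sketch below shows one way to do that; the `getCurrentWeather` helper and its return shape are illustrative assumptions, not part of the Vertex AI SDK.

```javascript
// Hypothetical local implementation of the declared function.
// A real app would call an actual weather service here.
function getCurrentWeather(location, unit = 'celsius') {
  return {weather: 'super nice', location: location, unit: unit};
}

// Given a candidate from the model's first turn, collect any
// functionCall parts and build the functionResponse parts to
// send back in the second turn.
function buildFunctionResponseParts(candidate) {
  const parts = [];
  for (const part of candidate.content.parts) {
    if (!part.functionCall) continue;
    const call = part.functionCall;
    if (call.name === 'get_current_weather') {
      const args = call.args || {};
      parts.push({
        functionResponse: {
          name: call.name,
          response: {
            name: call.name,
            content: getCurrentWeather(args.location, args.unit),
          },
        },
      });
    }
  }
  return parts;
}
```

With this helper, the second turn becomes `chat.sendMessageStream(buildFunctionResponseParts(candidate))`, where `candidate` is taken from the first turn's aggregated response.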