# Generate content stream with Multimodal AI Model
The code sample demonstrates how to use Generative AI Models to generate text in a streaming format based on a combination of video, image, and text inputs.
Code sample
-----------
[[["Facile à comprendre","easyToUnderstand","thumb-up"],["J'ai pu résoudre mon problème","solvedMyProblem","thumb-up"],["Autre","otherUp","thumb-up"]],[["Difficile à comprendre","hardToUnderstand","thumb-down"],["Informations ou exemple de code incorrects","incorrectInformationOrSampleCode","thumb-down"],["Il n'y a pas l'information/les exemples dont j'ai besoin","missingTheInformationSamplesINeed","thumb-down"],["Problème de traduction","translationIssue","thumb-down"],["Autre","otherDown","thumb-down"]],[],[],[],null,["# Generate content stream with Multimodal AI Model\n\nThe code sample demonstrates how to use Generative AI Models to generate text in a streaming format based on a combination of video, image, and text inputs.\n\nCode sample\n-----------\n\n### Go\n\n\nBefore trying this sample, follow the Go setup instructions in the\n[Vertex AI quickstart using\nclient libraries](/vertex-ai/docs/start/client-libraries).\n\n\nFor more information, see the\n[Vertex AI Go API\nreference documentation](/go/docs/reference/cloud.google.com/go/aiplatform/latest/apiv1).\n\n\nTo authenticate to Vertex AI, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).\n\n import (\n \t\"context\"\n \t\"errors\"\n \t\"fmt\"\n \t\"io\"\n\n \t\"cloud.google.com/go/vertexai/genai\"\n \t\"google.golang.org/api/iterator\"\n )\n\n func generateContent(w io.Writer, projectID, modelName string) error {\n \tctx := context.Background()\n\n \tclient, err := genai.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/go/latest/genai.html#cloud_google_com_go_vertexai_genai_Client_NewClient(ctx, projectID, \"us-central1\")\n \tif err != nil {\n \t\treturn fmt.Errorf(\"unable to create client: %w\", err)\n \t}\n \tdefer client.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/go/latest/genai.html#cloud_google_com_go_vertexai_genai_Client_Close()\n\n \tmodel := client.GenerativeModel(modelName)\n \titer := model.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/go/latest/genai.html#cloud_google_com_go_vertexai_genai_GenerativeModel_GenerateContentStream(\n \t\tctx,\n \t\tgenai.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/go/latest/genai.html#cloud_google_com_go_vertexai_genai_FileData{\n \t\t\tMIMEType: \"video/mp4\",\n \t\t\tFileURI: \"gs://cloud-samples-data/generative-ai/video/animals.mp4\",\n \t\t},\n \t\tgenai.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/go/latest/genai.html#cloud_google_com_go_vertexai_genai_FileData{\n \t\t\tMIMEType: \"video/jpeg\",\n \t\t\tFileURI: \"gs://cloud-samples-data/generative-ai/image/character.jpg\",\n \t\t},\n \t\tgenai.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/go/latest/genai.html#cloud_google_com_go_vertexai_genai_Text(\"Are these video and image correlated?\"),\n \t)\n \tfor {\n \t\tresp, err := iter.Next()\n \t\tif err == iterator.Done {\n \t\t\treturn nil\n \t\t}\n \t\tif len(resp.Candidates) == 0 || len(resp.Candidates[0].https://cloud.google.com/vertex-ai/generative-ai/docs/reference/go/latest/genai.html#cloud_google_com_go_vertexai_genai_Content.Parts) == 0 {\n \t\t\treturn errors.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/go/latest/genai/tokenizer.html#cloud_google_com_go_vertexai_genai_tokenizer_Tokenizer_New(\"empty response from model\")\n \t\t}\n \t\tif err != nil {\n \t\t\treturn err\n \t\t}\n\n \t\tfmt.Fprint(w, \"generated 
response: \")\n \t\tfor _, c := range resp.Candidates {\n \t\t\tfor _, p := range c.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/go/latest/genai.html#cloud_google_com_go_vertexai_genai_Content.Parts {\n \t\t\t\tfmt.Fprintf(w, \"%s \", p)\n \t\t\t}\n \t\t}\n \t\tfmt.Fprint(w, \"\\n\")\n \t}\n }\n\n### Java\n\n\nBefore trying this sample, follow the Java setup instructions in the\n[Vertex AI quickstart using\nclient libraries](/vertex-ai/docs/start/client-libraries).\n\n\nFor more information, see the\n[Vertex AI Java API\nreference documentation](/java/docs/reference/google-cloud-aiplatform/latest/com.google.cloud.aiplatform.v1).\n\n\nTo authenticate to Vertex AI, set up Application Default Credentials.\nFor more information, see\n\n[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).\n\n import com.google.cloud.vertexai.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/java/latest/com.google.cloud.vertexai.VertexAI.html;\n import com.google.cloud.vertexai.generativeai.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/java/latest/com.google.cloud.vertexai.generativeai.ContentMaker.html;\n import com.google.cloud.vertexai.generativeai.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/java/latest/com.google.cloud.vertexai.generativeai.GenerativeModel.html;\n import com.google.cloud.vertexai.generativeai.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/java/latest/com.google.cloud.vertexai.generativeai.PartMaker.html;\n\n public class StreamingMultimodal {\n public static void main(String[] args) throws Exception {\n // TODO(developer): Replace these variables before running the sample.\n String projectId = \"your-google-cloud-project-id\";\n String location = \"us-central1\";\n String modelName = \"gemini-2.0-flash-001\";\n\n streamingMultimodal(projectId, location, modelName);\n }\n\n // Ask a simple question and get the response via streaming.\n public static void streamingMultimodal(String projectId, String location, String modelName)\n throws Exception {\n // Initialize client that will be used to send requests.\n // This client only needs to be created once, and can be reused for multiple requests.\n try (https://cloud.google.com/vertex-ai/generative-ai/docs/reference/java/latest/com.google.cloud.vertexai.VertexAI.html vertexAI = new https://cloud.google.com/vertex-ai/generative-ai/docs/reference/java/latest/com.google.cloud.vertexai.VertexAI.html(projectId, location)) {\n https://cloud.google.com/vertex-ai/generative-ai/docs/reference/java/latest/com.google.cloud.vertexai.generativeai.GenerativeModel.html model = new https://cloud.google.com/vertex-ai/generative-ai/docs/reference/java/latest/com.google.cloud.vertexai.generativeai.GenerativeModel.html(modelName, vertexAI);\n\n String videoUri = \"gs://cloud-samples-data/video/animals.mp4\";\n String imgUri = \"gs://cloud-samples-data/generative-ai/image/character.jpg\";\n\n // Stream the result.\n model.https://cloud.google.com/vertex-ai/generative-ai/docs/reference/java/latest/com.google.cloud.vertexai.generativeai.GenerativeModel.html#com_google_cloud_vertexai_generativeai_GenerativeModel_generateContentStream_com_google_cloud_vertexai_api_Content_(\n ContentMaker.fromMultiModalData(\n PartMaker.fromMimeTypeAndData(\"video/mp4\", videoUri),\n PartMaker.fromMimeTypeAndData(\"image/jpeg\", imgUri),\n \"Are this video and image correlated?\"\n ))\n .stream()\n .forEach(System.out::println);\n }\n }\n 
### Node.js

Before trying this sample, follow the Node.js setup instructions in the
[Vertex AI quickstart using client libraries](/vertex-ai/docs/start/client-libraries).
For more information, see the
[Vertex AI Node.js API reference documentation](/nodejs/docs/reference/aiplatform/latest).

To authenticate to Vertex AI, set up Application Default Credentials. For more information, see
[Set up authentication for a local development environment](/docs/authentication/set-up-adc-local-dev-environment).

    const {VertexAI} = require('@google-cloud/vertexai');

    /**
     * TODO(developer): Update these variables before running the sample.
     */
    const PROJECT_ID = process.env.CAIP_PROJECT_ID;
    const LOCATION = process.env.LOCATION;
    const MODEL = 'gemini-2.0-flash-001';

    async function generateContent() {
      // Initialize Vertex AI
      const vertexAI = new VertexAI({project: PROJECT_ID, location: LOCATION});
      const generativeModel = vertexAI.getGenerativeModel({model: MODEL});

      const request = {
        contents: [
          {
            role: 'user',
            parts: [
              {
                file_data: {
                  file_uri: 'gs://cloud-samples-data/video/animals.mp4',
                  mime_type: 'video/mp4',
                },
              },
              {
                file_data: {
                  file_uri:
                    'gs://cloud-samples-data/generative-ai/image/character.jpg',
                  mime_type: 'image/jpeg',
                },
              },
              {text: 'Are these video and image correlated?'},
            ],
          },
        ],
      };

      // Stream the generated content and print each chunk of text as it arrives.
      const result = await generativeModel.generateContentStream(request);

      for await (const item of result.stream) {
        console.log(item.candidates[0].content.parts[0].text);
      }
    }

What's next
-----------

To search and filter code samples for other Google Cloud products, see the
[Google Cloud sample browser](/docs/samples?product=generativeaionvertexai).