Gemini is a family of generative AI models developed by Google that is designed for multimodal use cases. If you haven't used Gemini models on Vertex AI before, see the Generative AI introduction.
Key advantages of Gemini include the following:
Enhanced performance: The latest large language models (LLMs), such as Gemini 1.5 Flash, demonstrate better understanding across a range of natural language tasks than the AutoML text model. For more information, see the publicly available technical report from the Gemini team.
Flexibility: Gemini allows for both prompting (quick adaptation) and fine-tuning (deeper customization), catering to different project needs. This flexibility allows for rapid prototyping, testing, and deployment using prompting, with the option to fine-tune the Gemini model weights for optimal performance on specific tasks. Vertex AI offers both console-based fine-tuning and SDK and API options for programmatic control.
Multipurpose and multimodal capabilities: Gemini offers the ability to process text, images, and other modalities. This approach enables consistent use of a single format and model across various tasks. This flexibility allows the process to be easily adapted for different applications, streamlining and accelerating development efforts.
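For example, a task that required a dedicated AutoML classification model can often be expressed as a single Gemini prompt. The sketch below builds such a prompt; the template and label handling are illustrative, not an official format:

```python
# Sketch: replacing an AutoML text classification model with a zero-shot
# Gemini prompt. The prompt template below is an illustrative assumption,
# not a prescribed format.

def build_classification_prompt(text: str, labels: list[str]) -> str:
    """Build a zero-shot classification prompt for a Gemini model."""
    label_list = ", ".join(labels)
    return (
        f"Classify the following text into exactly one of these labels: "
        f"{label_list}.\n"
        "Respond with the label only.\n\n"
        f"Text: {text}"
    )

prompt = build_classification_prompt(
    "The battery died after two hours.",
    ["positive", "negative", "neutral"],
)
print(prompt)
```

The same prompt string can then be sent to a Gemini model with the Vertex AI SDK or REST API, and adjusted per task without training a new model.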
Gemini supports most features available in AutoML text. However, there are differences, and the Gemini client libraries and APIs aren't backward compatible with AutoML text. In other words, you must plan to migrate your resources to benefit from Gemini features.
If you are planning a new project, you should build your code, job, dataset, or model with Gemini. This lets you take advantage of the new features and service improvements as they become available.
Recommended steps for migrating to Gemini
Use the following recommended steps to update your existing code, jobs, datasets, and models from AutoML text to Gemini.
Read about the major differences between Gemini and AutoML text at Gemini for AutoML text users.
Review any potential changes in pricing (see Gemini migration pricing).
Take inventory of your Google Cloud projects, code, jobs, datasets, models, and users with access to AutoML text. Use this information to determine which resources to migrate and ensure that the correct users have access to the migrated resources.
Review any changes to IAM roles, and then update service accounts and authentication for your resources.
Migrate your resources from AutoML text to Gemini.
View the locations available for Gemini.
Identify usage of AutoML text APIs to help determine which of your applications use them and to identify the method calls that you want to migrate.
Update your applications and workflows to use Gemini.
Plan your request quota monitoring. See Quotas and limits.
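A simple way to act on the inventory from the steps above is to track each AutoML text resource and its migration status in one place. The sketch below uses placeholder resource names; it only illustrates the bookkeeping, not any migration API:

```python
# Sketch: tracking which AutoML text resources still need migration.
# Resource names below are placeholders, not real resources.

inventory = [
    {"resource": "projects/demo/models/ticket-classifier", "migrated": False},
    {"resource": "projects/demo/datasets/support-tickets", "migrated": True},
]

# List everything not yet migrated so nothing is missed before the
# AutoML text shutdown.
pending = [r["resource"] for r in inventory if not r["migrated"]]
print(f"{len(pending)} resource(s) still to migrate:")
for name in pending:
    print(" -", name)
```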
Gemini migration pricing
Migration is free. After migration, legacy resources are still available to use in AutoML text until the service shuts down in June 2025. To avoid unnecessary costs, shut down or delete legacy resources after you have verified that your objects have migrated successfully.
Gemini pricing compared to AutoML text pricing
Gemini pricing is generally lower than the pricing for equivalent tasks in AutoML text. Gemini pricing is determined by whether you're using the model for prompt engineering only, fine-tuning only, or a combination of both. For more information, you can compare AutoML text pricing with Gemini pricing.
For entity extraction models, note that serving output (and therefore cost) may be higher because the response contains the complete structured data rather than only labels.
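To illustrate the size difference, compare a label-and-offset style result with a full structured result for the same sentence. Both output shapes below are made up for the example; real responses vary:

```python
import json

# Hypothetical illustration of why entity-extraction serving output can be
# larger with Gemini: the response is the complete structured result, not
# just labels and character offsets. Both output formats are invented for
# this comparison.

input_text = "Jane Doe joined Acme Corp in Zurich in 2021."

# Compact label/offset style output.
offset_output = "person:0-8;org:16-25;location:29-35"

# Full structured output: every entity carries its type and text.
structured_output = json.dumps({
    "entities": [
        {"type": "person", "text": "Jane Doe"},
        {"type": "org", "text": "Acme Corp"},
        {"type": "location", "text": "Zurich"},
    ]
})

print(len(offset_output), len(structured_output))
```

The structured response is several times larger in characters, which translates into more billed output for the same extraction task.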
Identify usage of AutoML text APIs
You can determine which of your applications call the AutoML APIs, and which methods they call. Use this information to help determine whether these API calls need to be migrated to Gemini:
For each of your projects, go to the APIs & Services Dashboard to see the list of APIs the project uses. To learn more, see Monitoring API usage.
If enabled, you can check the audit logs created by AutoML text as part of Cloud Audit Logs.
To see usage of specific AutoML text methods, go to the AutoML text Metrics page.
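If you export audit log entries, you can filter them for AutoML method calls programmatically. The entries below are simplified stand-ins; real Cloud Audit Logs records are richer JSON objects, so treat this as a sketch of the filtering idea only:

```python
# Sketch: scanning exported audit log entries for AutoML text method calls.
# The entries below are simplified; real Cloud Audit Logs records contain
# many more fields.

log_entries = [
    {"methodName": "google.cloud.automl.v1.PredictionService.Predict"},
    {"methodName": "google.cloud.aiplatform.v1.PredictionService.Predict"},
    {"methodName": "google.cloud.automl.v1.AutoMl.ListModels"},
]

# Keep only AutoML methods; these are the calls to migrate to Gemini.
automl_calls = sorted({
    e["methodName"]
    for e in log_entries
    if e["methodName"].startswith("google.cloud.automl.")
})

for method in automl_calls:
    print(method)
```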
Manage changes to IAM roles and permissions
Vertex AI provides the following Identity and Access Management (IAM) roles:
aiplatform.admin
aiplatform.user
aiplatform.viewer
Using Vertex AI datasets is no longer required. Data for fine-tuning Gemini is stored in Cloud Storage instead.
For more information on IAM roles, see Access control.
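Because tuning data now lives in Cloud Storage rather than a Vertex AI dataset, you typically prepare it as a JSONL file and upload it to a bucket. The record shape below follows the role-tagged "contents" structure used for Gemini supervised tuning, but check the current Vertex AI tuning documentation for the exact schema before relying on it:

```python
import json

# Sketch: preparing a supervised tuning dataset as JSONL for upload to
# Cloud Storage. The record shape (role-tagged "contents") is based on the
# Gemini tuning format, but verify the exact schema in the current docs.

examples = [
    ("The battery died after two hours.", "negative"),
    ("Setup took thirty seconds. Flawless.", "positive"),
]

with open("tuning_data.jsonl", "w") as f:
    for text, label in examples:
        record = {
            "contents": [
                {"role": "user", "parts": [{"text": f"Classify: {text}"}]},
                {"role": "model", "parts": [{"text": label}]},
            ]
        }
        f.write(json.dumps(record) + "\n")
```

The resulting file would then be uploaded to a bucket (for example, a path like gs://your-bucket/tuning_data.jsonl) and referenced when you start a tuning job.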
What's next
Read Introduction to tuning in Gemini.
See Gemini for AutoML text users for a comparison of Gemini and AutoML text.