Use pretrained LLMs

This page describes how to use a textual large language model (LLM) in your custom recommendation models. These models are trained for you; you only need to enable the pretrained LLM features for your custom recommendation models.

Recommendations feeds the product description field to the LLMs and incorporates the resulting text embeddings into your recommendation models.
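The description is the standard description field on each catalog product. The following sketch, which assumes the google-cloud-retail Python client and placeholder project, branch, and product values, shows where that field is populated when a product is created:

    # Minimal sketch: populating the description field on a catalog product
    # with the google-cloud-retail Python client. PROJECT_ID, the branch path,
    # and the product values are placeholders.
    from google.cloud import retail_v2

    client = retail_v2.ProductServiceClient()

    product = retail_v2.Product(
        title="Organic cotton crew-neck t-shirt",
        # The description field is the text that recommendations feeds to the LLM.
        description=(
            "A breathable, 100% organic cotton t-shirt with a relaxed fit, "
            "reinforced seams, and a tagless collar for all-day comfort."
        ),
        categories=["Apparel > Tops > T-shirts"],
    )

    response = client.create_product(
        parent="projects/PROJECT_ID/locations/global/catalogs/default_catalog/branches/0",
        product=product,
        product_id="tshirt-organic-001",
    )
    print(response.name)

Richer, more specific descriptions give the LLM more signal to work with, so the quality of this field directly affects the embeddings the models receive.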

New LLM textual features

While it's possible to get text embeddings by manually configuring a Vertex AI generative model, you might want to integrate the new LLM capabilities into your recommendations models to improve performance.
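For comparison, the manual path looks roughly like the following sketch, which assumes the vertexai Python SDK and an illustrative embedding model name; with this approach you have to store, refresh, and serve the resulting vectors yourself:

    # Minimal sketch of the manual alternative: requesting text embeddings
    # directly from a Vertex AI embedding model. The project, location, and
    # model name are illustrative and not part of the recommendations integration.
    import vertexai
    from vertexai.language_models import TextEmbeddingModel

    vertexai.init(project="PROJECT_ID", location="us-central1")

    model = TextEmbeddingModel.from_pretrained("text-embedding-004")
    embeddings = model.get_embeddings(
        ["A breathable, 100% organic cotton t-shirt with a relaxed fit."]
    )
    vector = embeddings[0].values  # list of floats you would manage yourself
    print(len(vector))

Enabling the pretrained LLM features in your recommendation models removes this manual step.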

The text embeddings are longer, more descriptive, and not repetitive, and they have multilingual interpretation capabilities. This feature is allowlist-based. Contact support to enable it.

There's no separate charge for using the text embeddings; they are included in Vertex AI Search pricing.

The LLM-pretrained embeddings improve semantic understanding of long-form text, such as descriptions, in searches.

See the following resources for more information on how to use embeddings and generative AI on their own in your custom ML training:

Model compatibility

The LLM feature is compatible with all ML model types and objectives (see the model-creation sketch after this list), including:

  • Others you may like (OYML)
  • Frequently bought together (FBT)
  • and more
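As an illustration, the following sketch creates a "frequently bought together" model with the google-cloud-retail ModelService. The project path, display name, and the type and objective strings are illustrative assumptions; note that no LLM-specific setting appears in the model configuration, because the feature is enabled through the allowlist described above.

    # Minimal sketch: creating a "frequently bought together" model with the
    # google-cloud-retail ModelService. PROJECT_ID and the field values are
    # placeholders; the pretrained LLM feature needs no extra configuration here.
    from google.cloud import retail_v2

    client = retail_v2.ModelServiceClient()

    model = retail_v2.Model(
        display_name="fbt-demo",
        type_="frequently-bought-together",
        optimization_objective="revenue-per-order",
    )

    operation = client.create_model(
        parent="projects/PROJECT_ID/locations/global/catalogs/default_catalog",
        model=model,
    )
    created = operation.result()  # model creation is a long-running operation
    print(created.name)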

For more information on the different types of recommendation models Vertex AI Search for commerce supports, see About recommendations models.